News this Week

Science  18 Oct 2002:
Vol. 298, Issue 5593, pp. 510



    What to Do When Clear Success Comes With an Unclear Risk?

    1. Eliot Marshall

    An expert panel, meeting in an emergency session last week, urged the U.S. Food and Drug Administration (FDA) to lift a hold it had placed on three gene therapy trials after a patient treated with gene therapy in France developed cancer (Science, 4 October, p. 34). The panel made the recommendation after concluding that the cancer was almost certainly caused by the gene therapy. For safety reasons, the United States, France, and Germany have suspended clinical trials that use the same gene-transfer technology. But the United Kingdom has not, leaving it up to clinicians and patients to weigh the risks and benefits. FDA's advisers seem to favor the British approach.

    Chaired by molecular biologist Daniel Salomon of the Scripps Research Institute in La Jolla, California, the FDA panel confirmed what many had feared: A 3-year-old boy in the French trial has developed cancer that probably was caused by a modified retrovirus that was used to shuttle healthy genes into his cells. Yet panel members also recognized that the trial resulted in the only unequivocal success for gene therapy so far. Alain Fischer and his colleagues at the Necker Hospital for Sick Children in Paris have treated 11 children with severe combined immunodeficiency (SCID), a disease that often causes children to die from infections before they are 1 year old. Nine now have sound immune systems. Because the benefits seem clear and the risks are poorly understood, the panel agreed that the research should go on but with strict monitoring of therapies that involve retroviruses.

    The evidence linking the boy's cancer to the retrovirus used to treat him came from the French team itself, working with Christof von Kalle, a molecular biologist now at the University of Cincinnati Children's Hospital in Ohio. Von Kalle discovered the problem, he explained in an interview, because “we were trying to follow the healthy cells” of young patients. His analysis showed that eight of the patients were doing well. But an anomaly turned up earlier this year in patient number four, who had received therapy at the age of 1 month.

    Thirty months after therapy, in spring 2002, this boy had a high concentration of a particular type of immune cell (γδ T cell) in his blood. Doctors initially thought this was a “sampling error,” von Kalle explained. But by late August, the boy had anemia and an enlarged spleen. By 2 September, the count of the anomalous T cells had shot up to 300,000 cells per microliter of blood. Clinicians began giving him a type of chemotherapy used for T cell leukemia, while alerting health officials in France and other countries. Since September, the child's unhealthy T cell count has come down, but no one knows what course this unique disease will take.

    Meet the press.

    FDA's Philip Noguchi (top) and biologist Christof von Kalle talk to reporters after panel meeting.


    Stored blood samples revealed that the patient's explosive T cell growth likely began sometime between the 13th and 17th month after therapy. Von Kalle analyzed the errant T cells—a clonal outgrowth of a single treated cell—and found that they included the sequence of the retrovirus vector and the new curative gene it transported. But he also found something that left the FDA panel chair “scared”: The foreign DNA had inserted itself, in reverse, in the initial coding region of a gene (LMO2) essential for the early development of blood cells. More than a decade ago, researchers tied aberrant expression of this gene to leukemia. Von Kalle described one additional anomaly that appeared in these T cells: Part of chromosome 6 was duplicated and attached to chromosome 13.

    A few members of the FDA panel argued that gene therapy shouldn't get all the blame for triggering the unhealthy T cell growth. Some found it hard to believe that a short, reverse-oriented DNA insertion would have such a devastating effect. And, as von Kalle noted at the hearing, the boy got a chickenpox infection just before his T cell count soared, and a sibling and a distant relative had cancer in childhood. But oncologist Linda Wolff of the National Cancer Institute and retrovirus expert John Coffin of Tufts University in Medford, Massachusetts, both described how, in animals, retrovirus insertions can dramatically change the expression of genes—even distant ones.

    Stuart Orkin of the Dana-Farber Cancer Institute at Harvard University in Boston then read a warning from a report he had co-authored in 1995, noting an inherent risk of leukemia in retrovirus-based gene therapy. He said that there are “potentially numerous sites within the genome that could contribute to leukemia,” adding that the more he learns about the genome, the more possibilities he finds. In summing up, Salomon said there is no avoiding it—the most successful gene therapy trial also appears to have been the first to induce cancer.

    Salomon and other panel members said FDA should ask clinics to step up their monitoring of patients who have been treated with retroviruses. FDA estimates that about 300 clinical trials have provided therapy using retroviruses and 150 are still active. The panel also recommended that SCID patients be excluded from this therapy if they can get a bone marrow transplant from a matching (HLA-identical) donor, and that clinicians warn volunteers that retrovirus therapy can cause cancer. “We should be absolutely clear,” Salomon said: “This shouldn't be a line buried in acres of text.”

    FDA usually follows advice from such panels. Philip Noguchi, the agency's gene-therapy specialist, said he thought the panel had reached a “remarkable consensus” on several points. He said that FDA plans to request “some modifications” and ask clinicians using retroviral vectors to notify participants in their study of the leukemia found in this case and revise their consent forms to include this information. But he couldn't say when the trials might resume.

    One panel member—Abbey Meyers, president of the National Organization for Rare Disorders—made a pitch for placing all retrovirus-based trials on hold because no one can judge the risks. But her message didn't carry as much emotional weight as another advocate's. A woman who identified herself only as a grandmother of a SCID child rose from the audience to ask that the trials continue. Her grandson, she said, has failed bone marrow transplantation four times and has been waiting 3 years to be enrolled in a trial, now on hold, at the National Institutes of Health. The FDA panel paid heed.


    Panel Prescribes Study to Treat Growing Pains

    1. Jeffrey Mervis

    Call it tough love. Last week a U.S. House spending panel approved a 13% increase for the National Science Foundation (NSF), putting it on course for a doubling of its budget in 5 years. But the committee, concerned that the agency might not be ready to handle such an infusion, asked an outside group of management experts to delve into how NSF does its business. The review is expected to question some well-worn practices at the 52-year-old agency, including borrowing many of its managers from academia.

    The House Appropriations Committee approved a 2003 budget for NSF of $5.42 billion. That's $70 million more than its Senate counterpart approved in July (Science, 2 August, p. 755) and $394 million more than the Bush Administration requested for the new fiscal year, which began 1 October. Although Congress is currently mired in a budgetary morass, the similarity of the House and Senate numbers augurs well for NSF. “It's a historic time,” says Director Rita Colwell about the congressional vote of confidence.

    Within that overall boost, both NSF's research and its major facilities accounts would get 15% hikes, with the House adding $26 million to finish a high-altitude environmental research plane and $25 million for a neutrino experiment beneath the South Pole. Education programs would get only the requested 4% rise, although the panel took $40 million from the $200 million sought for math and science partnerships and distributed it among several smaller programs. The overall NSF number is very close to the 15% annual rate needed to double the agency's budget over 5 years, a cherished goal of community lobbyists.

    Management model.

    NSF's Rita Colwell “welcomes” review of agency practices.


    With the agency about to march off in double time, legislators are asking the National Academy of Public Administration (NAPA) to see if NSF is ready for the journey. “We're not criticizing them, but we want to be sure they can handle the growth,” says one congressional aide. Looking at recent budgets, legislators wonder if NSF has gorged itself on top-down cross-disciplinary initiatives in information technology, nanotechnology, and biocomplexity while starving individual fields, in particular physics, chemistry, and astronomy. Those disparities, says a report accompanying the spending bill, could undermine a time-tested precept that “the choice of research priorities and individual projects should flow principally from practicing scientists … through external peer review.” Notes another aide, “A lot of NSF's budget is broken down into tiny pieces, with the chunks carved up at the top. Is that the best way to stay at the cutting edge of science?”

    The report language also expresses concern about NSF's extensive use of scientists borrowed for a few years from somewhere else, usually a university, to fill positions at all levels. NSF officials believe strongly that such rotators, who make up almost 40% of NSF's 600-person scientific work force, represent new blood and also spread the word about NSF after returning to their home institutions. But the result might also be staff members “who have less experience and could have split loyalties between their federal roles and past or future employers,” says the report. Legislators are especially concerned about the prevalence of rotators at the top: The heads of five of NSF's seven research directorates are currently on temporary assignments. (There's a search on for a sixth chief.)

    Colwell says NSF “welcomes the attention” from NAPA or any other group asked to look at its management acumen, although she insists that the agency “is already seen as a model organization” within the federal government. And she strongly defends NSF's personnel practices. “It's a constant renewal of ideas and views,” she says about the use of rotators, who typically stay for 2 to 4 years. The NAPA study, which can't start until after NSF's 2003 budget is approved, is expected to take a year or so.


    Plans for Pluto and Hubble Gain in Congress

    1. Andrew Lawler

    Pluto was the Roman god of the dead, but a $488 million mission to his planetary namesake is very much alive. Last week, a U.S. House spending panel brushed aside objections by the Bush Administration and agreed to a Senate plan to continue funding the effort. The decision, coupled with a National Research Council report this summer that backs exploration of Pluto and the nearby Kuiper belt, virtually ensures that the controversial mission will move forward.

    Pluto's kiss of life came from the House Appropriations Committee, which voted to boost NASA's 2003 budget by $400 million over this year's $14.9 billion. That's $300 million more than the Administration requested, although most of that will go to projects requested by individual legislators. Within science programs, the bill increases funding to explore Mars, asks NASA to consider extending the life of the Hubble Space Telescope, and chides the agency for backing away from materials science research on the space station.

    The Pluto mission has been unpopular with both the Clinton and Bush administrations, which would prefer that NASA work on advanced propulsion systems that might eventually provide a faster trip to Pluto and the Kuiper belt. But Senator Barbara Mikulski (D-MD) has fought hard to restore funding, and last week the House panel matched the $105 million approved for Pluto by a Senate panel in late July (Science, 2 August, p. 755) for the 2003 fiscal year that began on 1 October. There's another $15 million for it as part of a new series of low-cost missions, leaving the project just $2 million shy of the amount supporters say is needed to keep it on track for a 2006 launch. Although Congress has fallen far behind in passing spending bills for the new year, the similar funding levels in both houses for the Pluto mission virtually ensure its continuation.

    Back from the dead.

    A NASA mission to Pluto, shown arriving in 2015, got a boost last week from Congress.


    A 2006 launch would deliver the spacecraft to Pluto by 2015 and to the Kuiper belt by 2026. The date was picked to stay ahead of a projected freezing of the planet's thin atmosphere as it moves away from the sun, although last week astronomers reported new data suggesting that Pluto's atmosphere in fact might be warming rather than cooling. Administration officials say privately that a 2006 launch might be impossible because the spacecraft's nuclear-electric power system requires a complex approval process and a new launch vehicle might not be ready. But project manager Tom Coughlin of the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland, insists that neither problem is a showstopper.

    The House panel also provided a $20 million boost to NASA's Mars exploration program to cover rising costs in planned robotic and orbiter missions. It also urged NASA to ignore the advice of a recent controversial report that ranked some materials science as a low priority aboard the space station. At the same time, lawmakers decided that equipment shortages on the orbiting base preclude a proposed $11 million biology project called Generations.

    As for the Hubble, the committee told NASA to study an extension beyond 2010, when the instrument was scheduled to return to Earth aboard the space shuttle. Committee members fear that delays in launching the newly named James Webb Space Telescope could leave astronomers with a viewing gap. The new scheme would require an additional Hubble servicing mission in 2007.


    Secret Weapons Tests' Details Revealed

    1. Martin Enserink

    Documents released last week by the Pentagon about secret biological and chemical weapons tests have fueled the anger of veterans who say they were used as unwitting guinea pigs. But biological and chemical arms experts say that there are no major revelations in the documents—although they do illustrate the vast scope of the U.S. chemical and biological warfare program at the height of the Cold War.

    The information was released as the Pentagon tries to document a series of 134 chemical and biological warfare studies that were planned in the 1960s. The tests came to light as the result of pressure from worried veterans—some of whom blame health problems on exposure to test agents—and members of Congress. Many of the trials were never carried out, but at least 46 trials took place, acknowledged William Winkenwerder, assistant secretary of defense for health affairs, last week at a press briefing. The newly released material pertains to 27 of them. The Department of Defense intends to produce and post online detailed fact sheets about all of the tests by next spring.

    The papers document a wide-ranging effort to study biological and chemical weapons—including bacteria, nerve agents, and at least one agricultural pest. The tests were conducted from Florida to Alaska, on islands under U.S. jurisdiction and in Canada and the United Kingdom. Several of the tests involved simulants, agents resembling the real thing but considered harmless, such as Bacillus globigii, a bug now classified as a strain of Bacillus subtilis that is a close relative of the anthrax bacterium.

    But real pathogens and toxic chemicals were used in more than 20 of the tests revealed so far. In an operation dubbed Shady Grove, for instance, the U.S.S. Granville S. Hall and five army tugboats in the Pacific were sprayed with two species of bacteria: Francisella tularensis, which causes tularemia, and Coxiella burnetii, the cause of Q fever. Both microbes can cause severe and potentially fatal infections.

    U.S. Navy and Army crew members involved in this test “should have been fully informed of the details,” according to the fact sheet, and “should have worn appropriate … protective equipment.” But during a Senate committee hearing last week, retired Navy commander Jack Alderson, who participated in Shady Grove, testified that he was never told about the test's purpose and that no protective materials were issued. Officials say test records contain no evidence that anyone got sick, although it's not clear whether the microbes caused no infections or whether those infected were successfully treated with antibiotics.

    More than a dozen tests used nerve agents, including sarin and the extremely lethal VX. In these tests—designed to show, among other things, how well the agents dispersed under various climate conditions and whether they clung to ships, clothing, or the ground—far fewer participants were involved, and they wore protective gear, according to the Pentagon.

    Other trials illustrate the nation's broad interest in biowarfare. In operation “Magic Sword,” Aedes aegypti mosquitoes, which can transmit yellow and dengue fevers, were released off the coast of Baker Island, a U.S. atoll in the North Pacific, to work out the logistics of mosquito-borne viral attacks. (The mosquitoes weren't infected, and they were eradicated after the exercise.) And in an experiment in Florida, the Army used a plane to spray a fungus that causes a devastating disease called stem rust. The goal was to see whether it reduced crop yields in test plots.

    The Pentagon is trying to track down and inform more than 5000 people involved in the tests. So far, more than 50 veterans have filed claims with the Department of Veterans Affairs (VA) because they believe they're suffering from conditions triggered by the tests. But unless the vets share some common set of symptoms—which VA says is not the case—it will be next to impossible to link specific complaints to the tests, says Harvard biologist and arms control expert Matthew Meselson. The Institute of Medicine has just begun working on a $3 million study funded by VA that will compare health status and mortality among test participants to that of a control group of veterans.

    Meanwhile, biological and chemical arms experts are scouring the documents for details about the U.S. program, which President Nixon ended in 1970. But most say there's little new information. An unclassified Army document published in 1977 confirmed that field tests with biological agents had taken place, says Meselson, who's surprised that the fact sheets have triggered so much publicity. “I guess the media tends to forget these things,” he says. Still, the stream of documents illustrates the surprisingly large scale of the research program, says Jonathan Tucker of the Monterey Institute's Center for Nonproliferation Studies in Washington, D.C.

    The revelations also serve another, unintended purpose, says Leonard Cole of Rutgers University in Newark, New Jersey, author of a book about previously revealed Army experiments on unwitting subjects. They serve as a reminder to authorities not to conduct experiments—even those in the national interest—without first obtaining informed consent from the participants.


    Into Painless Piercing? Try It With Microwaves

    1. Mark Sincell*
    1. Mark Sincell writes from Houston, Texas.

    Anyone unfortunate enough to remain awake in the dentist's chair may be acutely aware of at least two of the three primary drawbacks to using a spinning mechanical drill to grind a hole: noise, vibration, and flying debris. The drill bit feels the pain, too, eventually wearing out or breaking under the repeated stress. Now, a Tel Aviv University team led by mechanical engineer Eliyahu Jerby reports on page 587 of this issue that it has developed a drill that uses microwave energy to excavate solids. The new microwave-powered drill suffers from none of the problems that plague mechanical drills. It is silent, steady, and dust-free, and the bits almost never wear out.

    Drilling with electromagnetic radiation is nothing new. For years, engineers and scientists have been using the tightly focused light beams from laser drills to punch tiny holes as small as 1 micrometer in everything from semiconductor circuit boards to human bone. But laser drills are expensive, and a several-hundred-thousand-dollar laser drill might not always be the right tool to quietly put a 1-millimeter-wide hole in a concrete block.

    So Jerby's team cooked up a low-cost alternative in the kitchen. “We pulled the magnetron from a domestic microwave oven,” Jerby says. “It cost about $20.” To focus the microwaves, radiation from the magnetron is directed into a rectangular metal box that guides the microwaves into one end of a piece of coaxial cable—“just like the cable going to your TV, except ours is a little stiffer,” Jerby explains. The other end of the cable is placed near the surface where the hole will be drilled.

    Holier than thou.

    Microwaves promise clean, silent drilling at a fraction of the cost of lasers.


    By adjusting a mirror at one end of the metal box, the researchers can match the impedance of the coaxial cable to the surface being drilled. That tuning allows microwave energy to travel into the surface instead of being reflected, thus concentrating the energy of the microwaves into a spot just below the surface. As the spot starts to heat up, changes in the material cause a peculiar thing to happen: Instead of cooling more rapidly to shed the excess heat, the spot starts to soak up even more energy than before. A molten hot spot forms beneath the surface of the material, and a drill bit passing down the center of the coaxial cable can easily scoop out the molten material.
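    The runaway heating described above can be illustrated with a toy calculation. The sketch below models a hot spot whose microwave absorption grows with temperature, competing against conductive losses; all coefficients are illustrative stand-ins, not measured material properties from the Tel Aviv experiments. Below a threshold power the spot settles at a mild equilibrium; above it, absorption outpaces losses and the spot "runs away" toward melting.

```python
# Toy model of thermal runaway under microwave heating: in many
# ceramics, absorption rises with temperature, so a spot that heats
# up soaks up even more energy. All numbers are illustrative.

def simulate(power_w, steps=40000, dt=1e-3):
    """Euler-integrate the hot spot's temperature (degrees C)."""
    t_amb = 25.0                 # ambient temperature
    temp = t_amb
    heat_cap = 1.0               # lumped heat capacity (J/C), assumed
    loss = 0.05                  # conductive loss coefficient (W/C), assumed
    alpha0, gain = 0.01, 0.01    # absorbed fraction grows with temperature
    for _ in range(steps):
        absorbed = power_w * alpha0 * (1.0 + gain * (temp - t_amb))
        temp += dt * (absorbed - loss * (temp - t_amb)) / heat_cap
        if temp > 2000.0:        # treat as molten: runaway occurred
            return temp
    return temp

# Below a threshold power the spot equilibrates; above it, it runs away.
low = simulate(power_w=100.0)    # settles near ambient
high = simulate(power_w=2000.0)  # exceeds the melting cutoff
```

The qualitative point is the positive feedback: once absorption's growth with temperature exceeds the material's ability to shed heat, no equilibrium exists and the spot melts, which is what lets the bit scoop out material.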

    Jerby's team has already used a prototype microwave drill to put holes with diameters ranging from about 1 millimeter to 1 centimeter in ceramics, concrete, basalt, glass, and silicon. Because regions near the hot spot stay relatively cool, even brittle materials don't build up enough thermal stress to shatter, Jerby says. “The cool thing is that you can drill without wear, breakage, or cracking the tool bit,” says electromagnetic scientist John Booske of the University of Wisconsin, Madison, who believes that the microwave drill will be particularly useful for drilling ceramics such as those used to mount semiconductor devices on a circuit board. “It would also be great for drilling jewelry and pottery,” says Booske. Jerby adds another low-tech application of his silent drill: concrete construction. “If you have ever had a neighbor drill into a concrete wall next to your apartment at midnight, you know what I mean,” he says.

    But don't expect to see the microwave drill during your next dental checkup. Although the lab model emits less radiation than a typical home microwave oven, Jerby says, “safety is still a big concern.” To keep stray microwaves from cooking the internal organs of an unwary drill operator, production models of the microwave drill either will be completely enclosed, like an oven, or will use a shielding plate.


    Miscue Raises Doubts About Survey Data

    1. David Malakoff

    A misrigged trawling net has brought a haul of problems for the U.S. National Marine Fisheries Service (NMFS). The faulty net has been used for the past 2 years in NMFS surveys of Atlantic fish populations, which help regulators set catch limits for cod and other important species. Now, some commercial fishers and members of Congress want the government to delay controversial catch restrictions that they say might be based on flawed data.

    The controversy, which some have dubbed Trawlgate, was triggered last month when NMFS officials disclosed that the 1000-meter-long cables aboard the government research vessel Albatross IV were mismarked. The cables are supposed to carry marks every 50 meters, so that researchers can repeatedly pull trawl nets evenly across the bottom in annual efforts to track population trends. But officials said that the uneven spacing caused one cable to be as much as 2 meters longer than the other during typical tows, in which the net is lowered 70 to 250 meters. That could make the trawl lopsided and possibly reduce catches. The admission, prompted by a tip from a commercial fisher who 2 years ago noticed contractors misapplying the marks, produced a hailstorm of criticism from fishing groups. NMFS quickly invited six critics on a 3-day cruise that examined the troubled net with underwater video cameras and called a 2-day summit between scientists and fishers. On 3 October, the two sides reported that the error had an as-yet-undetermined “effect” on at least eight surveys over the last 2 years.

    All wet?

    Critics say mismarked trawl net (top) might have biased fish population counts.


    Independent researchers say the scientific impact of the misrigging is likely to be minor. But the mishap has accelerated efforts to overhaul the 60-year-old Atlantic survey program, which senior NMFS researchers at the summit described as “broken.” Government officials and commercial fishers are already discussing ways to gather more and better data by using upgraded government equipment and getting more help from commercial trawlers.

    Until such improvements are in place, some critics say the government should drop plans to help some stocks recover from decades of overfishing by limiting catches in New England and elsewhere. A federal court, for instance, has ordered New England regulators to cut catches by one-third or more by next August (Science, 17 May, p. 1229), a deadline Representative Bill Delahunt (D-MA) now wants the judge to delay for up to 2 years. “Given the documented shortcomings of the research, the only sensible course is to pause for a deep breath,” he says. NMFS officials, however, note that almost none of the potentially flawed data were used in formulating the recovery plan, and they say it should move ahead.

    Government fisheries researchers, meanwhile, hope that the painful glitch will bolster their push for better—and better funded—stock-assessment efforts. “We've been wanting to make improvements for a while,” says fisheries scientist Russell Brown of the Northeast Fisheries Science Center in Woods Hole, Massachusetts. “We just didn't expect to have to do it in this kind of charged atmosphere.”


    New Stem Cell Fund Raises Hackles

    1. Gretchen Vogel

    A tidy sum of money gift-wrapped for stem cell research has sparked recriminations and soul-searching in Sweden, a country at the vanguard of the hot young field. Several prominent scientists have charged that a new 75 million Swedish kronor ($8.1 million) stem cell fund has subverted the country's rigorous peer-review process by awarding large amounts of money to teams with sparse track records that have jumped on the stem cell bandwagon. If the winners “had applied [for grants] in a broader competition, the results would have been very different,” asserts Helena Edlund, a developmental biologist at the University of Umeå, who resigned in protest last week from the Swedish Research Council's (SRC's) medical advisory board.

    The controversy has triggered a broader debate about the wisdom of focusing scarce funds on narrow—some say trendy—areas of science. “A country such as Sweden should be careful [that it doesn't] waste its limited resources for basic research funding,” says Jan Lundberg, head of global research for the drug giant AstraZeneca, based in Södertälje. “It's very important that grants are given to the most strategic areas of research and not flooded into areas that happen to be hot at the moment.” The new stem cell program might seem minuscule by U.S. standards—indeed, much of its funding comes from abroad—but after a decade of stagnation for Sweden's research budget, critics argue, every award counts. Grantees contacted by Science defend their projects but welcome a wider discussion, one that should resonate in many small nations with vibrant scientific communities.

    Widely regarded as a stem cell pacesetter, Sweden is home to 25 of the 64 embryonic stem cell lines that U.S. President George W. Bush approved for research in August 2001, although all but two of the Swedish lines remain deep-frozen and untested (Science, 9 August, p. 923). In the field of adult stem cell plasticity, Jonas Frisén and his colleagues at the Karolinska Institute in Stockholm have found that stem cells from the brain may be capable of becoming many different cell types. And Anders Björklund and Olle Lindvall of Lund University are pioneers in fetal cell transplants, a basis for possible stem cell therapies for Parkinson's disease.

    Caught in a firestorm.

    SRC biomedical chief Harriet Wallberg-Henriksson defends the stem cell grants, pointing out that they consume a tiny fraction of the council's budget.


    Last fall, a report by the Boston Consulting Group advised a coalition of research foundations to capitalize on these head starts and funnel more funding into the stem cell field. That message pleased Swedish politicians, who were well aware of the field's potential. The Bush announcement “triggered enormous hype here in Sweden,” says Thomas Edlund, a developmental neuroscientist at the University of Umeå and Helena Edlund's husband. “Suddenly everyone was saying we should go for it. This should be the future in Sweden.”

    That aura attracted the Juvenile Diabetes Research Foundation (JDRF). Last winter the charity, based in New York City, offered roughly $1 million a year for 5 years to SRC for stem cell research. JDRF hopes such work will lead to stem cells that can turn into pancreatic cells for transplantation. It required SRC to match the gift with $434,000 a year of its own funding. (The Swedish Diabetes Association chipped in another $100,000 a year.)

    Coordinated by SRC, the new stem cell fund issued a call for proposals in March, then put the submissions through a review by five international stem cell experts who ranked the top proposals and recommended funding levels. Last month, the fund announced the winners of its first 11 awards, worth a total of $4.8 million over 3 years. The two largest grants involve research networks among several institutions: one on somatic stem cell plasticity, the other on characterizing the frozen embryonic stem cell lines and deriving new ones.

    Thomas Edlund contends that many of the teams that won grants—either as individuals or as part of networks—would not have fared as well if their grant proposals had had to go up against proposals from other fields, as is the case for most funds distributed by SRC. He sent an open e-mail to colleagues across Sweden on 9 September complaining about the program and its results. The fund has “created a system that can give out huge grants to groups that don't have publications in the field,” he told Science, pointing to the network that plans to characterize as-yet-untested embryonic stem cell lines. Echoing that view, Christer Betsholtz, a molecular biologist at Göteborg University, who collaborated on one funded project, says that “too much freedom” was given to the fund's select review committee. “My opinion is that they didn't do a very good job,” he says.

    SRC officials defend the review and the agency's decision to accept earmarked funds. Harriet Wallberg-Henriksson, secretary-general of SRC's medicine division, says she has no complaints about the review panel's work. She also points out that the stem cell funds are a tiny fraction—about 1%—of SRC's biomedical research budget and were taken from a $2.3 million pot set aside for bioscience and biotechnology.

    But even supporters of the stem cell program, and a few of the grantees, welcome the broader debate. “There's a danger that research is becoming like fashions that change every year,” says Patrik Brundin of Lund University, who received $150,000 over 3 years to study the plasticity of neural stem cells. Money poured into trendy fields, he says, can lure scientists away from worthy, but less flashy, research. In that respect, SRC's stem cell fund is not the only program making waves. The Swedish Strategic Fund's selection process for six “networks of excellence” awarded in June has come under attack for supporting “trendy” fields. Two networks involve stem cell-related research.

    Scientists on both sides of the debate agree that a dose of reality is prudent for a young field with uncertain prospects. Brundin notes that fields such as gene therapy and xenotransplantation drew loads of attention and funding early on. When they ran into difficulties, he says, “the balloon popped” and they quickly faded.

    Likewise, some observers contend, it could be hazardous for a small country such as Sweden to shower money on stem cell research. “Giving overwhelming support to this highly speculative research area is risky,” Lundberg says. Because the grantees can now attract the country's top postdocs and students, “it will have long-term consequences for the future of science in Sweden,” he says. “Do we want to be seen as the stem cell country of the world? I think we don't.” Others hope the new fund is making a wise wager that will keep Sweden at the table of one of the hottest games in town.


    Ice Man: Lonnie Thompson Scales the Peaks for Science

    1. Kevin Krajick*
    1. Kevin Krajick is the author of Barren Lands: An Epic Search for Diamonds in the North American Arctic. He lives in New York City.

    Glaciologist Thompson cores ice from the world's loftiest glaciers, seeking to retrieve precious records of ancient climates before they melt away

    QUELCCAYA ICE CAP, PERU—As he neared the 5670-meter summit of Quelccaya ice cap in the high Andes, Lonnie Thompson appeared to be suffering. The 54-year-old glaciologist was sinking calf-high in loose snow with each step, gasping in the ever-thinning air, and hacking in violent coughing fits brought on by his chronic asthma. Every 100 paces or so he shoved the metal haft of his ice ax into the snow and hung his head between his knees—“gets blood to the brain,” he explained. Then he straightened and started upward again, through dark clouds and whipping snow. “You know, in a way the weather today's a blessing,” he said cheerfully. “When the sun's out, it bounces off the snow and blisters the roof of your mouth.”

    Thompson drills ice cores from Earth's most daunting peaks, a job that has given him an increasingly pivotal voice in how scientists think about climate. Over 3 decades, he has employed yaks, mountaineers, and sheer will power to virtually create the field of high-alpine tropical paleoclimatology. He first defied experts simply by showing it was possible to get deep cores from high peaks. Then he extracted records suggesting that much of the world is climatically more volatile than previously believed, especially the tropics, long regarded as set apart from wrenching polar and temperate-zone upheavals. Now Thompson faces his greatest challenge: Thanks in part to his data, it is clear that mountain glaciers worldwide are melting—fast. Last year he made global headlines by observing that the famous snows of Tanzania's Mt. Kilimanjaro—actually 11,700-year-old ice (see Perspective on page 548 and Report on page 589)—may be gone by 2015. Thus he is on a mission, racing through expeditions to Asia, South America, and Alaska to retrieve endangered samples.

    Some researchers disagree with Thompson's interpretations, such as the exact timing and magnitude of ancient climate shifts, but all agree that his daring rescues of the ice and its records make him a rare old-fashioned pioneer. “He's the closest living thing to Indiana Jones, and just in time,” says Harvard University geochemist Daniel Schrag. Paleoclimatologist Wallace Broecker of Columbia University's Lamont-Doherty Earth Observatory places him “in the ranks of our great explorers. You can argue about his interpretations, but the thing that will last is his data.” Adds James Hansen, director of NASA's Goddard Institute for Space Studies: “If he wasn't doing it, we'd lose those records forever. He's a sort of hero.”

    From Gassaway to the Andes

    An unlikely sort. Pale and nondescript, unfailingly polite, quiet unless spoken to, Thompson looks like an encyclopedia salesman when he puts on a suit. In poor, rural Gassaway, West Virginia, his parents never made it past eighth grade; but Thompson loved science, setting up weather instruments in the barn and betting lunch money on the likelihood of rain. He met his wife, polar glaciologist Ellen Mosley-Thompson—now a partner in his work and a top-ranked scientist in her own right—at their home state's Marshall University, where she was the lone female physics student. “I guess we both just grew up curious,” says Mosley-Thompson in her still-craggy mountain drawl. And practical: Thompson planned to be a coal geologist. “I hated poverty,” he says. “And West Virginia's full of coal.”

    Their horizons broadened during graduate work in the 1970s at Ohio State University (OSU) in Columbus, where he got a part-time job analyzing polar ice cores, a field then blossoming. Teams had recently drilled the first deep samples from Greenland and Antarctica, and founding fathers such as Willi Dansgaard of the University of Copenhagen were learning to study ancient precipitation, temperature, volcanic activity, and atmospheric composition from gases, isotopes, and other substances built up in thousands of yearly snow layers. The Thompsons were fascinated—“It started with atmospheric dust, but soon we were looking at all sorts of things,” says Mosley-Thompson—and both ended up working on polar drilling operations while working toward doctorates at OSU. Most people ignored the modest, low-latitude alpine glaciers, which presumably had melted, slid downhill, or otherwise eroded too much to hold good records.

    In thin air.

    Thompson and team drilled the highest core ever on Tibet's Dasuopu glacier.


    Because there are only a few polar drilling operations, competition for jobs was stiff. Mosley-Thompson advanced, becoming a senior investigator on Antarctic drilling projects and chairing the National Science Foundation's (NSF's) Greenland science planning committee by 1985. Thompson, however, felt outcompeted for choice work. His fallback: tales of an unusual ice mass in the remote Peruvian Andes that sounded promising. It had plenty of annual snow to form thick, readable layers; an elevation so extreme that the layers probably stayed frozen; and a flat base of basalt, so the ice probably never slid off. This was Quelccaya. Its root in Quechua means “to inscribe.”

    Quelccaya was beautiful and dangerous. In 1974 Thompson and a colleague drove far above timberline to the end of a dirt track used by Quechua-speaking alpaca herders. Two days' walking over raw bedrock and rolling tundra brought them to the great ice dome, surrounded by giant moraines and boulder fields. Scattered herders ventured to the edge, but no one on record had ever set foot on the ice itself; some native people see glaciers as purgatories for damned souls.

    Indeed, on the 4-kilometer climb between their 4900-meter off-ice base camp and the summit, Thompson discovered the terrors of high altitude: waking up in the black night feeling as if you were suffocating, because you are; searing headaches aspirin can't touch; 21,000-meter thunderheads that drive lightning and loose rocks horizontally over the ground; unfiltered sun that fries flesh like bacon. “I've never sought extreme adventure,” Thompson says earnestly. “It scares me. I only want the data.”

    He proposed drilling, but Dansgaard himself told NSF: “Quelccaya is too high for humans, and the technology [to drill it] does not exist.” But Thompson kept at it, and finally in 1979 NSF funded him to fly up a heavy polar-style drill and generator by helicopter—a venture that failed in the first hour, when the pilot veered dangerously in the thin air and declared the trip over.

    Thompson went home and talked University of Nebraska engineer Bruce Koci into designing a lighter drill that could be disassembled and powered by solar energy—the first such setup. Thompson slowly collected other collaborators and in 1983 mounted a new expedition with 40 mules, donkeys, and horses to ferry the drill parts to camp and 10 people to lug them over the ice—a low-tech approach that has become his trademark.

    Among the members were Koci, who has since designed drills for many polar programs; a high-altitude climatologist aptly named Keith Mountain; and a rock-tough Quechua-speaking mountain guide, Felix Benjamin Vicencio. The core team is still together after 20-some years—a secret to its success, all agree. “We've all lost enough skin and blood that no one needs to be told what to do,” says Mountain. “Something breaks, we fix it. Trouble comes, we know to get out of the way.”

    On top.

    Thompson's high-altitude adventures began near Peru's Qori Kalis glacier.


    Mosley-Thompson does not go on the alpine trips, but she is central to the science. She and Thompson are co-principal investigators on all projects, and together they pick sites, write proposals, and analyze data. When he is gone, she directs logistics and their shared lab, still at OSU. Because she works in Antarctica and Greenland, their field seasons are reversed—November to May for her, June to September for him—a perfect arrangement after their daughter Regina was born in 1976, because they could each travel for months, yet always have one parent home. “We both like being captain,” says Mosley-Thompson. “At home we have only two topics of conversation: our work and our family.”

    Core Skill

    On that 1983 Quelccaya expedition, Thompson's team spent 10 grueling weeks climbing daily from camp to the drill. At the end they had taken two cores to bedrock, some 160 meters down. Thompson was triumphant when the cores held what he had hoped: exquisite yearly dust and stable oxygen-isotope records documenting regional weather back to A.D. 470. It was nothing so vast as the 100,000-year-plus polar-ice archives, but it was still the first real tropical ice record. It showed major drying events indicated by thick swaths of dust and wet periods marked by thickened ice layers—probable indications of past El Niños.

    Most surprising was that the oxygen isotopes, which vary in precipitation according to temperature, showed major shifts matching the Medieval Warm Period of about 1050–1375 and the 300-year Little Ice Age, which ended around 1880. Before this, nearly everyone had thought that the tropics did not see such swings, because cores of oceanic plankton taken during the massive Climate/Long-Range Investigation Mappings and Predictions Project (CLIMAP) of the mid-1970s suggested they were largely stable even in glacial times, cooling only 1 or 2 degrees C at low elevations.

    Many researchers took intense interest. Archaeologists who had theoretically linked climate shifts with the rises and falls of pre-Inca civilizations were delighted; here were the shifts, recorded at the right times, including a 300-year drought starting around 1150 that many believe wrecked the Tiwanaku civilization around Lake Titicaca. “Quelccaya provided precision and time coverage we had never seen in this part of the world,” says archaeologist Izumi Shimada of Southern Illinois University, Carbondale. “It was the first, most crucial evidence that such climate events were real.”

    However, a single 1500-year core was not enough to change all scientists' views of ancient tropical climate—or of the importance of tropical ice cores in general. Many doubted that any tropical ice dated back to the last glacial age, the acid test for a long-term climate record. Thompson needed more and older cores. And over the next 15 years, he got them, in feats that were as much physical as scientific. Ice taken from the Guliya ice cap in remote western China in 1991 and 1992 by Thompson and Chinese colleagues had to be hauled across the Gobi Desert in ancient trucks; once when a drive shaft fell off, the team used ice cream to cool the cores.

    The Thompson Record

    (Timeline from the original article, most recent first)

    - Mt. Bona-Churchill, Alaska, 4300 meters
    - Puruograngri Ice Cap, Tibet, 6072 meters
    - Mt. Kilimanjaro, Tanzania, 5893 meters
    - Dasuopu, Tibet, 7200 meters (highest core; student Shawn Wight dies)
    - Sajama, Bolivia, 6542 meters
    - Huascarán, Peru, 6048 meters
    - Guliya Ice Cap, China, 6200 meters
    - Dunde Ice Cap, China, 5325 meters
    - Quelccaya, Peru, 5670 meters (first deep core of a tropical glacier)
    - Mosley-Thompson earns Ph.D., OSU
    - Thompson earns Ph.D., OSU
    - First exploration of Quelccaya Ice Cap, Peru
    - Thompsons graduate from Marshall University

    South American glaciers were equally daunting. Getting to the drill site on 6048-meter Huascarán, Peru's highest peak, meant dodging daily avalanches and crossing a 10-meter-wide crevasse on a narrow ladder. So in 1993 they hauled up 6 tons of equipment and camped next to the drill for 53 days straight—perhaps a world record for high-elevation living. A gale hurled Thompson's tent, with him inside, toward a 3000-meter drop, until he jammed his ice ax through the floor. The drillers came back gaunt—and bearing 4 tons of 9-centimeter cores. In 1997 on Bolivia's 6542-meter Sajama, the indigenous Aymara people feared that drilling would anger mountain deities. So before ascending, Thompson and the others took part in a long ceremony that involved chanting, smoking cigarettes, drinking high-test alcohol, chewing coca leaves, and sacrificing a blindfolded white alpaca. Only 2 weeks after bringing down the Sajama cores, Thompson headed to Tibet to core the even higher Dasuopu glacier, at 7200 meters still the world's highest ice-core site (Science, 15 September 2000, p. 1916).

    There, disaster struck. Like everyone else, 26-year-old OSU graduate student Shawn Wight developed altitude sickness on the climb. But then Wight forgot Thompson's name, and Thompson sent him down the mountain, where he coughed up blood; Thompson took him to a hospital in Lhasa, but complications multiplied. Wight was shuttled to Cleveland, where doctors determined that he had picked up a bacterial infection. Wight died two and a half months after leaving Dasuopu. His parents sued OSU for $21 million, claiming that safety procedures were inadequate and that Thompson had reacted too slowly. Three years of legal battles revealed that the group lacked equipment such as oxygen tanks and satellite phones, which Thompson said were too cumbersome and would have made no difference. “The mountains are dangerous,” he says. “You can reduce risk, but never eliminate it.” In 2001 an Ohio state court ruled that Thompson had acted properly, and dismissed the case, but the affair left him shaken. A lengthy article in The Chronicle of Higher Education (27 July 2001) asked “Research at What Cost?”

    Friends fear as well that Thompson pushes himself too hard. Once when he was in a Columbus hospital with bloody sputum and blue feet following a trip to Ecuador, a doctor discovered a minor heart-valve defect and advised him never to go to high altitudes again. That was 20 years ago. Six years ago he was also diagnosed with severe asthma, which worsens when he ascends. “It's controlled by drugs,” he insisted at Quelccaya, in between hacking bouts.

    Ice preserver.

    Ellen Mosley-Thompson in the cold room at Ohio State.


    Reading the record

    Despite the costs, the cores kept coming, and in the mid-1990s the data, published in leading journals including Science, began to turn climate scientists around. According to several indicators, the Huascarán ice was 19,000 years old (Science, 7 July 1995, p. 46)—well into the last glacial age—and Sajama's, 25,000 years old (Science, 4 December 1998, p. 1858). The latter date was particularly hard to assail because it was based on carbon-14 analysis of plant matter and insects swept in by snowstorms—a rarity in ice cores.

    The oxygen isotopes suggested that South America had been 5 to 12 degrees C cooler back then. Dust concentrations 200 times above present levels showed it was far drier; low nitrate levels suggested the Amazon basin was probably far less forested. The ice also showed big postglacial shifts including the dramatic cooling of the Younger Dryas period around 12,000 years ago. This was powerful evidence that the tropics, too, were swept by global climate changes. And similar findings were beginning to come in from paleotemperature records in fossil corals, Andean lake sediments, and Amazon groundwater. Many researchers reversed views and came around to the idea that the tropics are crucial for understanding the past and, therefore, the future.

    The Thompsons have “left little doubt about whether long-term climate phenomena are really global,” says Ted Scambos, a polar glaciologist at the National Snow and Ice Data Center in Boulder, Colorado. Thompson and others argue that the tropics may in fact drive climate; he points out that much of Earth's landmass, and thus much of its water cycling, resides there. Some speculate that ocean currents may transmit changes from pole to pole via the tropics, or perhaps vice versa, but it's still not clear what drives what.

    Debate continues as well over the exact interpretations of the ice data. Whereas Thompson claims that some Himalayan ice is very old—Guliya more than 500,000 years based on radiometric dating of chlorine isotopes—many are skeptical, because there is little supporting evidence. And some researchers, such as glaciologist Gerald Holdsworth of the University of Calgary, complain that Thompson picks sites “for novelty, not science.” He and others also question Thompson's interpretation of oxygen isotopes. This involves comparing the amount of 18O versus 16O in each layer and assuming that the heavier isotope is more common in precipitation during warm times, a method other glaciologists pioneered on polar cores. Few doubt that the isotopes represent some kind of temperature shift, but many specialists say that in the tropics the ratio may be affected by additional factors such as changes in moisture sources and seasonal precipitation shifts. “Overall, he's shown the tropics have big swings. It's just a matter of degree,” says Geoff Seltzer, a glacial geologist at Syracuse University who works in the Andes.
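    For readers unfamiliar with the shorthand: the isotope comparison at issue is conventionally reported as a δ18O value, the deviation of a sample's 18O/16O ratio from that of a reference standard (for waters, Vienna Standard Mean Ocean Water). This convention is standard in the field, though the article does not spell it out:

    ```latex
    \[
    \delta^{18}\mathrm{O} =
    \left(
      \frac{\bigl(^{18}\mathrm{O}/^{16}\mathrm{O}\bigr)_{\mathrm{sample}}}
           {\bigl(^{18}\mathrm{O}/^{16}\mathrm{O}\bigr)_{\mathrm{standard}}}
      - 1
    \right) \times 1000
    \]
    ```

    The result is expressed in parts per thousand (‰). Higher (less negative) δ18O in precipitation is conventionally read as warmer conditions; the dispute described above is over how cleanly that reading holds in the tropics, where moisture sources and rainfall seasonality also shift the ratio.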

    The Thompsons are the first to admit that the picture is complex, with different regions showing shifts at different times or not showing them at all. But “it's unrealistic to think conditions will be the same everywhere at the same time. That's why we need cores from so many places,” says Mosley-Thompson. Recently other groups have entered the field, coring alpine glaciers in the Yukon, Bolivia, and Ecuador, and their studies so far tend to bolster the Thompsons'. “Eventually there will be a coherent picture,” says Thompson. “It's complicated, but it's knowable.”

    Thinning ice

    Thompson captured his colleagues' attention with the South American cores, but Mt. Kilimanjaro, Africa's highest peak, made him famous. In early 2000, his team arrived for a month-long coring expedition, toting what some regarded as a suicidal new toy: a specially designed hot-air balloon for lifting cores off the windy 5893-meter summit. The Tanzanian government barred the balloon from the mountain, so Thompson hired 92 porters to carry the cores down. The ice is nearly 12,000 years old, and parts of it appear to align with major climate events shown in other cores. These include a heavy dust band indicating a disastrous drought about 4200 years ago, which also shows up on Peru's Huascarán. Ancient records report that at this time, the Nile stopped flooding and crops failed.

    More immediately, in February 2001 Thompson announced that Kilimanjaro had lost 80% of its ice cover in the 20th century and predicted the ice would be gone in 15 or 20 years. This news appeared on the front pages of The New York Times and other newspapers, featured as a clarion call of drastic climate change. Thompson points out that the observation came from nothing more than archival images and his updated aerial survey. “All we did was connect the dots. The local people could have told you it's melting just as well as I can.” He says that Africa's handful of other glaciers have already melted so much that data probably cannot be extracted from them. “Kilimanjaro is the first, and probably the last ice core from Africa,” he mourns.

    The picture is the same for nearly all tropical and midlatitude glaciers. In the Himalayas, photos show that some 30 outlet glaciers on 6400-meter Puruograngri, which Thompson and Chinese colleagues cored in 2000, have retreated a kilometer or two since 1972. Highland Peru still has the world's largest concentration of tropical ice—1600 square kilometers—but Thompson estimates that's down from 2000 square kilometers in 1972. One study this year by glaciologist Mark Dyurgerov of the University of Colorado, Boulder, shows that the equilibrium-line altitude, below which glaciers generally lose mass, has risen about 200 meters worldwide in the last 40 years.

    Quelccaya, where Thompson started, is a prime example. Aerial photos show that it has shrunk more than a fifth, to 44 square kilometers, since his first trip 28 years ago. This July Thompson made his 17th visit, with old hands including Felix Vicencio and glaciologist Vladimir Mikhalenko of the Russian Academy of Sciences in Moscow, and a Science reporter new to high altitude.

    From their base camp, a stiff walk leads to a 150-ton granite boulder balanced on a nearby ridge. “First time I came, there was a whole valleyful of ice sticking 20 meters above this boulder, pushing on it,” Thompson remembers. He pats the rock like an old friend. Now it sits balanced over an 80-meter drop to a brand-new lake. The ice is 400 meters away on the opposite shore, cracking and groaning in the midmorning sun.

    Going, going …

    Thompson estimates that the snows of Mt. Kilimanjaro, seen in 1912 (bottom) and 2000, might be gone by 2015.


    The next day, a several-hour hike over rugged hills and desert takes the party to a wind-torn mesa overlooking Qori Kalis glacier—Thompson's favorite symbol of global climate change. This stupendous vertical ice mass pours into a wide valley from the icecap's edge, but it is wasting away. Survey photos show an exponentially increasing rate of retreat: 4.7 meters a year through 1978, and 205 meters in 2000–2001. “Sit here and you can practically watch it disappear,” says Thompson. He has augered shallow cores periodically from the icecap's summit and found that in the past decade melting has percolated through the top layers. “I expect in 20 years I'll come back and see the marks where our drill punched through to bedrock in '83,” he says. “Quelccaya will be gone.”

    All this has made the Thompsons into preachers. They tell anyone who will listen, from 5th-grade classes to Ph.D. colleagues, that humans are warming the Earth. “You have to be blind not to see the changes,” says Mosley-Thompson. Says Harvard's Schrag, “[They have] absolutely the most central evidence. … Nothing like this has happened for thousands of years. Many people have changed their tunes after seeing [their] results.”

    Melting ice may be not only a sign, but a disaster in itself. The loss of data is “like a great library on fire,” says Keith Alverson, director of the Bern-based Past Global Changes (PAGES) program, which promotes paleoclimate research. Last month's collapse of a Russian glacier, which killed 120 people, may have been due to rapid melting; the United Nations has identified 44 new meltwater lakes in Bhutan and Nepal that may endanger communities downstream. And glacial meltwater supplies drinking water and hydropower to many nations such as Peru and Bolivia. “You deal with [climate change] or you adapt to it, but either way you have to know what's happening,” says Thompson.

    The Thompsons have a plan for finding out—they have their eyes on 13 sites worldwide to drill immediately. But it's a daunting list, including peaks in former Soviet republics close to ongoing wars; a 7400-meter Tibetan site even higher than Dasuopu; and Heard Island, a rarely visited speck in the southern Indian Ocean with weather so bad scientists would probably have to work and live under the ice surface. Government bureaucracies are slow with funds, says Thompson, and some ice may melt before he gets to it. So he is soliciting private donors such as media magnate Ted Turner and considering commercial endorsements for, say, camping-equipment companies.

    His growing celebrity makes this suddenly sound feasible. Until recently, he had never won a big award; after the Kilimanjaro expedition, he was named a fellow of the American Geophysical Union, a Distinguished Scholar at OSU, and one of America's top scientists by CNN and Time. This year so far he has won the $150,000 Heineken Prize from the Royal Netherlands Academy of Arts and Sciences; the Swedish Vega Medal; and, along with Mosley-Thompson (and Julie Andrews and Fred “Mister” Rogers), the Common Wealth Award from PNC Bank. He is the hero of two new books: El Niño by J. Madeleine Nash (Warner, 2002) and the forthcoming Thin Ice by Mark Bowen (Henry Holt, 2003). This summer he returned in triumph to Gassaway as grand marshal of the town's annual parade.

    Meanwhile the field work and its risks continue. On the third day at Quelccaya in July, Thompson wanted to hand-auger another short core from the summit, but blowing snow and clouds kept the crew in camp until 9:30 a.m. Then sun appeared and they headed up. They used to race up the ice in 45 minutes, but that was with weeks of acclimatization and 20-some fewer years of age. This day, hindered by snowfall, it took closer to 6 hours. When the top came into view through the bad weather, Vicencio and Mikhalenko, who had arrived first, were turning the light fiberglass drill like a giant corkscrew. Thompson fell on his knees beside Mikhalenko and started logging samples. Presently someone noticed the time. “We started too late,” cried Mikhalenko. They stuck the auger upright in the snow without finishing and scrambled to get down before dark; dark can kill in a place like this.

    An hour later Vicencio and Mikhalenko emerged from the clouds just in time to see the last glimmers of sun on lakes far below. Thompson was ahead this time and had already disappeared in the gloom. Everyone was headed for a potentially deadly precipice at the ice edge. Vicencio and Mikhalenko are experts, though; reporter in tow, they located the danger zone, skirted it, and descended onto solid ground. There was no sign of Thompson—and they were now in a boulder field laced with dangerous holes nearly invisible in the blackness. Suddenly several bright lights appeared about 100 meters away. It was Thompson. Although he had bashed one leg hard on an unseen rock, he had felt his way down a streambed, found camp, and rifled the tents for flashlights. Then he had come back for his friends.

    Next morning, the weather cleared, and Thompson headed straight back up to finish taking the core.


    Smart Weapons Prove Tough to Design

    1. Jennifer Couzin

    Gleevec demonstrated the power of targeted cancer drugs, but applying similar strategies to the treatment of common cancers hasn't been easy

    Physicians treating lung cancer grasp at every straw of optimism they can find, as the straws are few and far between: Many patients are diagnosed when their disease has spread and the prognosis is grim. But in a packed hotel conference room in a Maryland suburb last month, optimism occupied roughly two dozen seats near the front—a cluster of patients whose doctors had expected them to die long ago. One, Adriane Riddle, had traveled from San Bernardino, California, to tell the assembled doctors, drug company representatives, and regulators about the miracle drug she believed saved her life when she was diagnosed with metastatic lung cancer two summers ago at a startlingly young age. “It has given me a chance to turn 20,” she said of Iressa, an experimental drug in a new class of targeted cancer therapies.

    Although Riddle radiated good health, Iressa's future is far from ensured. Data from several clinical trials have been inconsistent, and some have been flatly disappointing. Iressa's creators at AstraZeneca, in London, frankly admit that they have no idea who will benefit from the drug or why—but some clearly do. Even so, Iressa edged a step closer to the market at the 24 September meeting, when the Food and Drug Administration's (FDA's) Oncology Drug Advisory Committee voted 11 to 3 to recommend that the agency approve the drug to treat a common form of lung cancer, known as non-small cell lung cancer.

    FDA normally follows its advisory committee's suggestions. But at the meeting, the agency voiced concerns about ambiguous clinical trial results, questioning whether the drug works as well as AstraZeneca claims. The agency now has less than 6 months to decide whether to allow Iressa on the market. The lobbying has been intense. Some oncologists, having watched the drug shrink tumors in some patients, inundated the FDA with letters begging it to approve Iressa. And the day the committee met, an editorial in the normally staid Wall Street Journal attacked the FDA for its reluctance to endorse Iressa and other cancer drugs. The article's title: “FDA to Patients: Drop Dead.”

    Iressa is one in a new breed of cancer therapies: targeted drugs that home in on molecules critical to cancer. Unlike old-style chemotherapy that inflicts major collateral damage on healthy cells, these drugs should act like laser-guided missiles. For more than a decade, researchers and drug companies alike have been heralding their imminent arrival. Now, with a half-dozen drugs in late-stage clinical trials and some already on the market, such as the breast cancer drug Herceptin, the future has arrived. But it's not quite as rosy as some predicted. As the experience with Iressa demonstrates, designing, testing, and evaluating these therapies are proving more challenging than many expected.

    Miracle drug?

    In scattered cases, Iressa's impact is remarkable: Here, a lung tumor (top, right side) nearly disappears after treatment (bottom).


    The most celebrated new drug in this class is Gleevec, designed by Basel, Switzerland-based Novartis to treat chronic myeloid leukemia (CML). Approved in May 2001, Gleevec, used in ongoing therapy, sends nearly 100% of recently diagnosed patients into remission, without the torturous side effects of chemotherapy. Although not without problems, the drug has become the poster child for targeted cancer therapies, providing proof of principle that this approach—identifying and then disabling the molecular mechanisms that give rise to a cancer—is indeed a potent weapon.

    But as other targeted drugs wend their way through clinical trials, it is becoming increasingly clear that few will enjoy the smooth passage of Gleevec. Gleevec attacks a relatively rare cancer that arises from a single molecular defect. More prevalent (and, from a company perspective, more lucrative) tumors, such as those in the breast, colon, and lung, are also molecularly far more complex. A host of abnormalities may be at play, and targeting just one of them may not be sufficient.

    Iressa is a case in point. Although the drug disrupts a specific pathway (the same one targeted by several other experimental therapies), oncologists don't fully understand the role that pathway plays in lung cancer. That has translated into puzzling study results and confusion over how well Iressa really works.

    “It is terrific to finally have some agents that are born out of the last 20 years of research and that are targeted to a specific abnormality,” says oncologist Paul Harari at the University of Wisconsin, Madison. Still, “you can't wed yourself too closely [to a drug], you can't be blinded by your heart,” says Alan Sandler, medical director of thoracic oncology at Vanderbilt University in Nashville, Tennessee. “You want it to work so badly.”

    Hunting a tumor's Achilles heel

    Knowing what makes a killer tick is key to stopping one, and tumors are no different. From that perspective, CML was an ideal candidate for a targeted drug. Oncologists have known for decades that the leukemia cells of sufferers share a genetic oddity called the Philadelphia chromosome, an abnormal portion of a chromosome that arises when parts of two others are swapped, or translocated. Testing in petri dishes and in animals strongly suggested that this translocation was a prerequisite for CML, because it generates a protein called bcr/abl that spurs uncontrolled cell proliferation. Leukemia experts were convinced that if they could find a way to cripple this single defect—bcr/abl—the blood and bone marrow would be wiped clean of leukemia cells.

    Their hunch proved largely correct: Gleevec sailed through clinical trials and approval in just 3 years. Disabling that one molecule was sufficient to put 96% of early-stage patients into remission. Sixty percent in a later phase of disease also benefited.

    Applying this strategy to lung and other more common cancers, however, has proved much more daunting. The first hurdle is identifying the right target. Iressa homes in on a set of molecules called epidermal growth factor receptors (EGFRs). These receptors are overexpressed in a range of tumor types and correlate with a poor prognosis. Iressa uses EGFR as an entry into a tumor cell. Once inside, the drug prevents a cascade of signals from eventually reaching the cell's DNA and encouraging cell division. That approach, reasoned drug developers at AstraZeneca, should stop cancer in its tracks, while sparing cells with little or no EGFR.

    The logic was persuasive: Indeed, Iressa is just one of four anti-EGFR drugs being tested in large trials, although not all work in exactly the same way. The others are Tarceva, from Genentech in South San Francisco, California, and OSI Pharmaceuticals in Melville, New York; ABX-EGF from Abgenix in Fremont, California; and ImClone Systems' Erbitux, which is still being tested despite the controversy surrounding the company.


    Based on early data from clinical trials, FDA was sufficiently interested in Iressa to give it “fast-track” status last year—the chance to be considered with less data, so the drug could reach cancer patients sooner.

    So far, however, results on Iressa and other anti-EGFR drugs have been mixed. Few data are available yet on Tarceva and ABX-EGF. In one reported Tarceva trial of 34 women with advanced ovarian cancer, Genentech found that tumors shrank somewhat in just four. Physicians who've treated patients with various anti-EGFR therapies agree that the drugs can have a remarkable impact—but invariably on just a minority of patients. This holds true for Erbitux as well. FDA declined to approve Erbitux in December because, it said later, ImClone's trials were poorly designed, and the company is under investigation for misleading investors. Even so, doctors are convinced the drug sometimes works. “Erbitux is clearly a drug that has activity in colorectal cancer,” says Leonard Saltz, an oncologist at Memorial Sloan-Kettering Cancer Center in New York. “I don't think it would be appropriate to give up on it.”

    Many physicians hold the same view of Iressa, even though data from its clinical trials showed that attacking an EGFR pathway usually wasn't sufficient to halt cancer's spread. Although most of the tumors treated in the Iressa trials were thought to express or overexpress the receptors, most failed to shrink and often continued growing. In 10% of patients, however, the therapy dramatically cut tumor size, characterized as a “response”—one of several FDA-sanctioned measures of a drug's effectiveness. The reason for the uneven response, oncologists now suspect, is that although EGFR overexpression is common, it's critical to tumor growth in just a minority of cases. In short, an overabundance of EGFR doesn't necessarily mean that a cancer is dependent on EGFR for its survival.

    So how can EGFR-dependent tumors be identified so the drug can be applied to tumors where it will likely work? “No one really knows,” says oncologist Roy Herbst of M. D. Anderson Cancer Center in Houston, Texas, who participated in the Iressa trials.

    Even in those tumors that do overexpress EGFR, oncologists speculate, overexpression may be just one of a cluster of abnormalities driving cancerous growth. “To use an analogy, think about New York City,” says Saltz. “If you try to block traffic going down Second Avenue, cars might just go to Lexington and go that way—but if you block First, Second, Third, Lexington, and Park”—well, then you'd have a serious traffic jam. CML, the cancer that Gleevec treats, may be the bucolic New Hampshire town, where blockading Main Street shuts down activity. But lung cancer is the Manhattan nightmare, with traffic hurtling over a jumble of roads that crisscross the tumor.

    “As with so many other things, it's a lot more complicated than we thought,” says Harari.

    Trials and tribulations

    Designing clinical trials is tough for conventional chemotherapies. When it comes to targeted treatments, the challenges multiply.

    The handful of targeted cancer therapies on the market were initially tested in narrow patient populations: Gleevec only on CML, for example, and Herceptin only on women overexpressing the HER-2 protein. Had either of these drugs been tried in a broad swath of volunteers, say oncologists, they would have generated unimpressive results.

    “My challenge to the companies [with anti-EGFR drugs] would be, design your trial to select the patients who have the best chance of responding,” says Charles Sawyers, an oncologist at the University of California, Los Angeles, Jonsson Cancer Center.

    But AstraZeneca couldn't do that with Iressa. EGFR expression can be difficult to measure and, complicating matters, it doesn't always seem to correlate with tumor shrinkage. So AstraZeneca opened its trials initially to patients with non-small cell lung cancer who had failed two chemotherapy regimens.

    Big shoes to fill.

    Drug companies hope new therapies like Iressa will prove as effective as Gleevec.


    In a phase II trial of 216 patients with advanced lung cancer, Iressa, administered alone, brought significant improvement—dramatic tumor shrinkage in 10% of patients and, the company reported, easing of symptoms in 40%. Although 10% may sound low, that number is encouraging for a cancer so tough to treat. AstraZeneca, oncologists, and even FDA were optimistic that the drug would enhance survival, so the company began testing it as the first treatment after diagnosis. “We certainly all expected the first-line trials to be positive,” says Ronald Natale, acting director of Cedars-Sinai Comprehensive Cancer Center in Los Angeles and a lead figure in the Iressa trials.

    But that wasn't the case. A phase III trial with more than 1000 recently diagnosed lung cancer patients (two-thirds received Iressa along with standard chemotherapy as their initial treatment and one-third received a placebo with chemotherapy) showed that Iressa had absolutely no effect on how long patients lived.

    “We were kind of floored,” admits Richard Pazdur, head of oncology drug products at FDA, of the results, which were announced in August a few months after phase II data were presented at a cancer meeting. On the strength of the phase II trial in late-stage patients, he says, the agency would have quickly approved the drug. Indeed, Japan evaluated Iressa before the phase III results were announced and approved it last summer.

    But the new results on combination therapy have raised significant doubts. No one can explain why mixing Iressa with chemotherapy renders it ineffective, although Natale speculates that chemotherapy agents somehow prevent the drug from working properly. Because experimental drugs are given only to those who have exhausted existing treatment, or in combination with it, no one knows whether giving Iressa alone soon after lung cancer is diagnosed would help more than standard chemotherapy treatments.

    Flummoxed, FDA turned to its advisory panel for help. After struggling with the inconsistent trial results at its meeting last month, the panel concluded that Iressa was clearly helping some patients, including those seated in the room, and recommended approval.


    Iressa is not the only drug in this promising class to have suffered setbacks. Last month, Genentech announced that Avastin—a drug that targets vascular endothelial growth factor (VEGF), which supports new blood vessel growth—failed to extend the lives of women with breast cancer when combined with standard chemotherapy. The company is now awaiting results from a comparable trial in colon cancer. (Avastin was also generally combined with chemotherapy, in various cancers, in the company's phase II trials.)

    Even Gleevec has run into some trouble. About 80% of late-stage patients relapse while on the drug, compared with 1.5%, so far, in the newly diagnosed group. Analysis of leukemia cells suggests that those from late-stage tumors harbor many more mutations than those from early-stage disease, so disabling one may not stop the tumor (Science, 3 August 2001, p. 876).

    No easy answers

    Although enthusiasm for targeted drugs has dampened, it has by no means disappeared. “We all know these drugs work; we've all had patients who've benefited from them,” says Eric Rowinsky, director of clinical research at the Cancer Therapy and Research Center in San Antonio, Texas. But, says the National Cancer Institute's James Yang, “this is a field that isn't going to [yield] quick and simple answers.”

    On the bright side, some recent evidence hints that these drugs may have unexpected versatility. In 1999, for instance, George Demetri, head of the sarcoma center at Harvard's Dana-Farber Cancer Institute, and his colleagues in Boston and Oregon discovered that Gleevec, designed to turn off bcr/abl molecules at the heart of CML, also attacks another, closely related molecule, c-kit, that's at the root of gastrointestinal stromal tumor (GIST). Like CML, GIST is relatively rare and genetically quite simple. Physicians are now experimenting with the drug on other cancers. And although Iressa has been tested only on non-small cell lung cancer, physicians have tried other EGFR drugs on a range of cancers—colon, head and neck, and ovarian—with hints of success.

    Some speculate that Iressa and its brethren may be more effective when used in combination with other targeted therapies, in essence attacking different molecular targets simultaneously. Sloan-Kettering's Saltz warns, however, that the risk of killing normal cells multiplies with the number of targets being attacked, because healthy cells can rely on some of the same receptors as malignant ones.

    Nevertheless, one experimental combination study is already in the works with two Genentech drugs: Avastin, designed to fight new blood vessel growth, and Tarceva, designed to target EGFR. The company plans to test the pair together against lung cancer. It's logical, says Vanderbilt's Sandler, who with M. D. Anderson's Herbst is running the trial. But, he cautions, “things that make sense don't always work in medicine.”


    Tiny Worm Takes a Star Turn

    1. Jean Marx

    The nematode known as Caenorhabditis elegans is not much to look at. Just a millimeter long and transparent to boot, it is almost invisible to the naked eye. But in biological research the tiny worm looms large, providing a model system for studying everything from embryonic development to aging. Now, three researchers who pioneered the use of C. elegans as a model organism have won the Nobel Prize in Physiology or Medicine.

    The Nobel Assembly cited the three—Sydney Brenner of the Salk Institute for Biological Studies in La Jolla, California, and the Molecular Sciences Institute in Berkeley, California; H. Robert Horvitz of the Massachusetts Institute of Technology (MIT) in Cambridge; and John Sulston of the Wellcome Trust Sanger Institute in Cambridge, U.K.—particularly for discoveries concerning the genetic regulation of organ development and programmed cell death, a type of cellular suicide that helps sculpt the organs of developing organisms and regulate cell growth in mature ones.

    “I was just jumping for joy all day,” says Judith Kimble of the University of Wisconsin, Madison, who has studied C. elegans for more than 2 decades. Long-time cell death researcher Stanley Korsmeyer of Harvard's Dana-Farber Cancer Institute is also thrilled. “It's a wonderful thing for both [the cell death and worm] fields,” he says.

    Brenner won special plaudits for recognizing the potential of C. elegans in biology and then developing it as a model in work that dates back to the 1960s, when Brenner was at the MRC Laboratory of Molecular Biology in Cambridge, U.K. Early on, Horvitz recalls, many researchers viewed the worm as “Sydney's idiosyncrasy,” lagging too far behind the fruit fly to contribute much to genetic studies. But that didn't faze Brenner. “In the case of Sydney Brenner, this award has been so overdue,” notes worm researcher Martin Chalfie of Columbia University in New York City.

    C. elegans makes a good experimental model precisely because it is so small—it contains only about 1000 cells—and because its life cycle lasts just 3.5 days. Yet the worm is complex, consisting of a variety of cells and tissues, including a nervous system. These qualities make it easy for researchers to trace the cellular effects of mutations, thus opening the way to identifying the genes involved in development of the different cell types. Brenner gave the work an early boost by showing that he could induce mutations in the worm with a chemical known as EMS. The result was what Chalfie calls “a veritable gold mine of genes,” many of which turned out to be involved in nerve cell development and function.

    Apostles of elegans.

    John Sulston, Sydney Brenner, and H. Robert Horvitz (top to bottom).


    Sulston joined Brenner's Cambridge group in 1969. Over several years, he traced the lineages of the cells that form the worm's nervous system and then completed tracing the lineages of all the cells that make up the adult worm. Although difficult and painstaking, Sulston says, the work was far from tedious. “For me, it was absolutely entrancing,” he recalls. “Because one feels it is all going forward, it is worth doing.”

    Among other important findings, the lineage work showed that each individual worm is formed by exactly the same series of cell divisions. In addition, Sulston found that 131 cells, mostly those in the nervous system, undergo programmed cell death. Neurobiologists already knew that neuronal death helps form the nervous systems of mammals, but now researchers had an animal with which to explore the hows and whys of that cell death.

    That's where Horvitz's contribution comes in. After completing his graduate work at Harvard in 1974, Horvitz joined Brenner's group in hopes of using C. elegans to study the nervous system. While in Cambridge he helped with Sulston's lineage-mapping project, and on taking a faculty position at MIT in 1978 began a series of studies aimed at identifying developmental mutations in the worm. By the mid-1980s, that work led him to two genes, ced-3 and ced-4, needed for normal cell death during C. elegans development, plus a third, called ced-9, that protects against cell death.

    Horvitz “was the first to show that there was a genetic basis for cell death,” says Vishva Dixit of Genentech Inc. in South San Francisco. “The discovery really illuminated a new pathway to explore.” Researchers soon found that mammalian cells contain similar death genes.

    Disturbances in cell death pathways have medical implications. Excessive death has been linked to the neurological damage of stroke and Alzheimer's disease, for example, and cancer may result if cells fail to die when they should. Researchers are now exploring ways to treat neurological diseases by blocking cell death and, conversely, to trigger it in cancerous tumors. “In a relatively short time we came from a genetic basis [for cell death] to therapeutic consequences,” Dixit says.

    Although the Nobel Committee focused on cell death, Chalfie and others point out that the three new laureates have made other major contributions. For example, Sulston was a prime mover behind the recently completed sequencing of the human and worm genomes—work not mentioned in the Nobel citation—and Horvitz helped trace out an important pathway that regulates development both in the worm and in higher organisms. Says cell death researcher Dale Bredesen of the Buck Institute in Novato, California: “There's no question that these guys have done very exciting work that spans cell death and many other areas as well.”


    Neutrino Traps and X-ray Eyes

    1. Charles Seife

    Three physicists have won the ultimate scientific accolade for giving humanity new eyes. Half of this year's Nobel Prize in Physics went to Ray Davis of the University of Pennsylvania, Philadelphia, and Masatoshi Koshiba of the University of Tokyo for using neutrinos to gain insight into the cosmos. The other half went to Riccardo Giacconi of Associated Universities Inc. (AUI) of Washington, D.C., for a 4-decade-long effort to view the universe with x-ray spectacles.

    Davis and Koshiba dedicated their careers to hunting neutrinos, nearly massless elementary particles. When Davis started his quest in the late 1950s, physicists knew that Earth must be flooded with neutrinos emanating from the sun, but nobody knew how to detect them. Davis realized that he could use a rare reaction in which a neutrino strikes a chlorine atom, turning a neutron into a proton to produce argon, to sense neutrinos created by the decay of boron-8 in the sun.

    Deep in the Homestake Gold Mine in South Dakota, Davis filled a 37,850-liter tank with chlorine-rich fluid and then painstakingly scoured it for mere handfuls of argon atoms. After decades of effort, Davis showed, to physicists' shock, that Earth was being pelted with roughly a third as many neutrinos as theory predicted. This “solar neutrino problem” became an enduring puzzle in physics—and one of the first signs that neutrinos have mass.

    Koshiba and colleagues took the next step. They, too, used a mine to shield a vast tub of fluid from cosmic rays and radiation. But instead of chlorine, the Kamiokande detector, built near Kamioka, Japan, in 1982 and 1983, sought traces of neutrinos in water.

    Cosmic vision.

    Ray Davis, Masatoshi Koshiba, and Riccardo Giacconi (top to bottom).


    When a neutrino hits a particle in a tank of water, the collision can trigger a brief flash—the optical equivalent of a sonic boom. By detecting flashes with sensitive photodetectors, the Kamiokande team could not only detect neutrinos, but also tell where they were coming from. Kamiokande confirmed Davis's solar-neutrino paradox and later spotted neutrinos from Supernova 1987A, an exploding star 170,000 light-years away. Successors to Davis's and Koshiba's experiments are revealing the nature of neutrinos and shedding light on processes deep inside the sun. Among other things, they have solved the solar-neutrino paradox by showing that neutrinos change type, or “flavor,” en route from the sun to Earth. Davis and Koshiba “have an important part in the history of the subject,” says physicist John Bahcall of Princeton University. “[Their work] won't be described by paragraphs in the textbooks, but by chapters.”

    While Davis and Koshiba were equipping astronomers with neutrino eyes, Riccardo Giacconi worked with light—but light of a particularly troublesome type. Unlike visible light, x-rays are absorbed by the atmosphere and zoom right through mirrors. Their bothersome optical properties made x-ray astronomy impractical until scientists could loft compact detectors above the atmosphere. In 1962, Giacconi loaded a sensitive version of a Geiger counter aboard a sounding rocket. After two failed launches, the Aerobee rocket soared to 224 kilometers above Earth and, for the first time, detected x-rays coming from a source beyond the solar system.

    In 1960, Giacconi and Bruno Rossi of the Massachusetts Institute of Technology (MIT) figured out a clever way to focus x-rays onto a detector by skimming them along a surface, rather than bouncing them off a mirror as in conventional optics. The technique dramatically increased the sensitivity of x-ray telescopes. Since then, Giacconi has been involved in most of the major advances in x-ray astronomy, including 3 decades of satellite observations that have given astronomers crucial information about black holes, star formation, galactic nuclei, and other energetic events and objects. “He's an excellent choice. It's one of those cases that many of us thought should have happened years ago and then we gave up hope,” says Claude Canizares, an x-ray astronomer at MIT. “He's just a giant in the field.”

    Giacconi—who went on to run NASA's Space Telescope Science Institute in Baltimore, the European Southern Observatory in Munich, and now AUI, the organization that manages the National Radio Astronomy Observatory—is just pleased to have been a part of such eye-opening research. “X-rays give you the key to phenomena of cosmic evolution,” he says. “I was lucky enough to get involved right at the beginning.”


    Mastering Macromolecules

    1. Adrian Cho*,
    2. Dennis Normile
    1. Adrian Cho is a freelance writer in Grosse Pointe Park, Michigan.

    With the genomes of several organisms in the bag and others soon to follow, biologists are turning their attention to the myriad proteins those genes create. Researchers in the burgeoning field of proteomics are working to determine the sequence of amino acids that make up each giant protein molecule and to deduce the molecule's shape, which determines how it behaves. This year's Nobel Prize in Chemistry honored three researchers whose discoveries have made such studies possible.

    Half of the prize went to John Fenn of Virginia Commonwealth University in Richmond and Koichi Tanaka of Shimadzu Corp. in Kyoto, Japan, who independently developed techniques to ionize large molecules such as proteins. Kurt Wüthrich of the Swiss Federal Institute of Technology in Zürich received the other half for developing nuclear magnetic resonance (NMR) techniques that reveal the molecules' shapes.

    Working independently, Fenn and Tanaka discovered ways to give huge molecules an electrical charge without ripping them apart. The charged molecules, or ions, can then be fed into a mass spectrometer to determine their masses and, after further analysis, their amino acid sequences. Such studies wouldn't work without the prize-winning innovations, says John Yates, a mass spectrometrist at the Scripps Research Institute in La Jolla, California. “The ionization techniques are the horses that pull the cart,” Yates says.

    Fenn's technique, called electrospray ionization, begins with a solution of the molecules. A high voltage draws electrically charged droplets from a hollow needle, and these quickly evaporate, leaving behind the freely floating ions. Tanaka's technique, soft laser desorption, begins with a mixture of the jumbo molecules and smaller, light-absorbing molecules on a surface. A blast of laser light heats the absorbing molecules, causing tiny explosions that charge the big molecules and loft them into the air. Since their invention in 1987, the two techniques have become ubiquitous in academic and pharmaceutical laboratories.

    Proteomics pioneers.

    Kurt Wüthrich, Koichi Tanaka, and John Fenn (top to bottom).


    In the 1980s Wüthrich found ways to determine the shape of a very large biomolecule by studying how the hydrogen nuclei within it wobble when exposed to carefully tuned magnetic fields, a phenomenon known as NMR. Because the rate of nuclear wobble depends on the strength of both the applied fields and the magnetic fields from nearby atoms, each hydrogen nucleus will move at a slightly different rate and give off a radio signal of a slightly different frequency. Wüthrich found that he could match the different signals with individual nuclei and that correlations between signals could reveal the location of each hydrogen nucleus—and thus the structure of the molecule. “He really has started a new field, and the field is pretty big,” says Ad Bax, a biophysicist at the National Institute of Diabetes and Digestive and Kidney Diseases in Bethesda, Maryland.

    All three laureates say the award took them by surprise. During a hastily called press conference at Shimadzu headquarters, Tanaka appeared unshaven in the gray corporate uniform common among ordinary company employees in Japan. “If I had had any idea,” he said, “I would have put on a proper suit.” Wüthrich, 64, says his secretary “broke all the well-established rules” and called him out of a student seminar to relay the phone message from the Royal Swedish Academy. “I think I have not fully realized that I got it,” he says.

    Tanaka, 43, did his award-winning work in his late twenties. In contrast, Fenn was nearly 70 when he developed his technique. The 85-year-old says he has derived great satisfaction from the utility of his idea, but he's equally happy to get the prize. “The fame is nice,” Fenn says, “and I'll enjoy it.”


    Lab-Based Researchers Earn Prize in Economics

    1. Constance Holden

    The Nobel Prize in Economics this year goes to pioneers in two fields that many economists believe are more than ripe for recognition. The $1.1 million bounty will be shared by Daniel Kahneman of Princeton University in New Jersey, who has integrated psychology into economic theory, and Vernon Smith of George Mason University in Fairfax, Virginia, who has turned economics into an experimental science.

    Kahneman's selection “was no surprise,” says Princeton economist Michael Rothschild, who says he and a number of others had bet on him in the economics department's Nobel pool. People have been impressed by his work for the past 30 years, says Rothschild, but “it took a long time” for researchers to figure out how to integrate it into economics. The honor for Smith (who says, “I've been hearing this rumor since 1980”) is also “long overdue,” says economist Charles Plott of the California Institute of Technology in Pasadena.

    Economic theory supposes that individuals act with perfect rationality in seeking to optimize their gains. But cognitive psychologist Kahneman, 68, working with Amos Tversky of Stanford University, who died in 1996, identified various nonrational “heuristics”—or mental shortcuts—people use in arriving at decisions, especially in uncertain situations.

    Big gains.

    Daniel Kahneman (top) and Vernon Smith (bottom) expanded what economists study.


    Kahneman has demonstrated, for example, that people are generally more averse to the risk of losing something than they are attracted to the idea of gaining the same thing. Students who had been given coffee mugs at the beginning of an experiment wanted twice as much money to part with them as mugless students were willing to pay to acquire one.

    Kahneman also showed that people's willingness to accept risk depends on how different options are phrased. Presented with an epidemic that threatens to kill 600 people, subjects must choose between a program certain to save one-third of them and a gamble offering a one-in-three chance of saving everyone but a two-in-three chance of saving no one. Most pick the sure thing when the options are framed in terms of lives saved, but they prefer the gamble when the same outcomes are framed in terms of deaths. Kahneman's work, says the Bank of Sweden committee that bestows this prize, has given rise to a whole new field of “behavioral economics.”

    The work complements research by Smith, 75, who has demonstrated that—contrary to many economists' assumptions—economic behavior could be dissected in the lab. “Smith initiated the notion that you could actually use experiments to check the predictions of economic theory, particularly the actions of markets,” says Rothschild.

    Smith's experiments, which he refers to as “wind tunnels” of economics, have ranged from testing competing models for energy deregulation to allocating airport gate time slots. The Nobel Committee says that Smith's work has opened up vast new possibilities in the auction world. He has shown experimentally, for example, that “two-sided” auctions such as the stock market (in which both bidders and askers are yelling out prices) function even with small numbers of players. Smith says his findings have led to the design of new auction systems covering goods from utilities to the electromagnetic spectrum.

    Rothschild says the field of economics is moving more to the experimental and the behavioral. Indeed, the next prize may not even be in economics, says Smith: “There is interest in the [prize committee] in moving the economics prize to a more general social science award.”


    Researchers See Progress in Finding the Right Balance

    1. David Malakoff

    House Science Committee hearing explores a perennial debate between scientists and the government that was reignited by the terrorist attacks

    After years of controversy, U.S. scientists reached a deal of sorts with the government nearly 2 decades ago on how to handle information that might threaten national security: Classify some things, and don't touch the rest. That arrangement served the community well as the Cold War ended and the United States emerged as the world's paramount military power. Then came the 11 September terrorist attacks and the anthrax letters. Suddenly people were talking about gray areas of security, self-censorship, and prepublication reviews.

    Last week, barely a year after those horrific events, the House Science Committee convened a hearing to explore the proper balance between science and security. And a panel of government and academic leaders suggested that the country might be close to finding a new equilibrium point in this chronic debate. “I'm increasingly optimistic that we are going to be able to arrive at … constructive oversight,” says Ron Atlas, a dean at the University of Louisville, Kentucky, and president-elect of the American Society for Microbiology (ASM), which has been deeply involved in efforts to shape new research regulations.

    Atlas and other science leaders readily admit that they are not out of the woods. The government has yet to release the details of how it plans to implement several policies unveiled since 11 September. That list includes rules governing researchers who work with potential bioweapons, the shape of a new government committee that will screen foreign graduate students entering certain fields, and guidelines on how the heads of several major departments—including the one that oversees the National Institutes of Health (NIH)—can wield new powers to classify research results. Scientists are also closely watching a White House effort to define government information that, although not important enough to be classified, is “sensitive” enough to remain hidden from public view. And although science lobbyists feel that they have won approval for most of their organizational suggestions, the proposed Department of Homeland Security is still awaiting final congressional action.

    Much of last week's hearing, which included testimony from presidential science adviser John Marburger, revolved around a White House order issued last March for government agency heads to withhold “sensitive but unclassified information” that might aid terrorists. Although the memo did not define sensitive material, it prompted some agencies to delete documents from Web sites and withdraw information on everything from the history of chemical warfare to the characteristics of oil refineries. Those moves also coincided with reports that the Department of Defense and other agencies were pressuring university grantees to submit basic research results to government reviewers for vetting before publication.

    Although most universities have rebuffed the requests, saying that they violate existing Pentagon policies, concerns grew after the White House announced this summer that it planned to issue regulations shortly that would flesh out a new category of “sensitive homeland security information.” The news raised fears that the government was rewriting a hard-won compromise spelled out in a 1985 presidential directive that says the government should classify information that poses a threat to security and leave all else in the public realm.

    Be careful. Ron Atlas, M. R. C. Greenwood, and Sheila Widnall say new rules about sensitive data should be clear.

    Marburger went out of his way at the hearing to reassure academic researchers that the pending rules would not hinder them. It is “incorrect,” he said, to say “that the Administration is considering a policy of prepublication review of sensitive federally funded research.” The real goal, he said, is to find ways to shelter limited classes of government information—such as bioterror response plans or safety-related data—that might aid terrorists. Along the way, the White House has held several meetings with science and university groups.

    Marburger's outreach has gotten high marks from science advocates. “The Administration has made an effort to listen,” says Toby Smith, a Washington, D.C.-based lobbyist for the University of Michigan, Ann Arbor, one of the nation's largest research institutions. Atlas also congratulated Congress and the Administration for crafting new rules, such as the bioterror law, that “represent a balanced approach.”

    But Atlas and others insist that any new regulations must be clear enough to prevent risk-averse bureaucrats from interpreting them in a repressive manner. “The situation opens [researchers] to potentially arbitrary dictates, however well intended,” from government managers, noted Sheila Widnall, an aeronautics professor at the Massachusetts Institute of Technology in Cambridge and a former Secretary of the Air Force. “The right approach … is to identify precisely the specific areas that require classification and to build very high walls.”

    A clear definition of what constitutes a threat, panelists said, is also needed before the government decides how to screen foreign graduate students who want to study certain sensitive fields. Marburger noted that the planned screening committee will include experts from government science agencies, and he expects the number of fields covered to be small. But M. R. C. Greenwood, chancellor of the University of California, Santa Cruz, argued that keeping foreign doctoral students out of the United States might do little to prevent terrorists from acquiring desired skills—in part because of the rising number of science and engineering Ph.D.s being produced by European and Asian universities. “In some ways,” she said, “[this] is a modern version of closing the barn door after the horse has left.”

    Science committee chief Sherwood Boehlert (R-NY) said his panel plans further hearings. And he's not the only one interested. The National Academy of Sciences is planning a series of university-based workshops for researchers. Biologists, meanwhile, are discussing new voluntary guidelines on publishing potentially dangerous information, in part to head off possible government rules. The challenge, says Boehlert, is to “figure out how science should operate in a brave new world.”