News this Week

Science  26 Feb 1999:
Vol. 283, Issue 5406, p. 1234

    Bioterror Defense Initiative Injects Shot of Cash

    1. Eliot Marshall

    Johns Hopkins University staffers were nervous in January that they wouldn't be able to fill the seats at what they were billing as the “first ever” national health conference on how to defend civilians against bioterrorist attack. They need not have worried. When the symposium* came to order last week, the seats were full. “We could have sold another 300 places,” said an ebullient organizer. Reporters from dozens of news outfits lined a hotel ballroom in Crystal City, Virginia. And an audience of more than 900 turned up to hear the kickoff speech by Secretary of Health and Human Services (HHS) Donna Shalala, who said the meeting “will help replace complacency with a new sense of urgency.”

    Bioterrorism is suddenly on the map, bringing a major funding boost for research and defensive measures. This year alone, Shalala noted, the president's $1.4 billion antibioterrorism agenda will channel $158 million to HHS, and she is seeking an increase of $72 million next year. “I learned that the Administration is serious about setting up national defenses against bioterrorism,” said Raymond Zilinskas, a former member of the team that inspected Iraq's bioweapons effort who now works for the Monterey Institute of International Studies in Washington, D.C. It's “tremendous,” he says, that the government is doing something.

    This sense of urgency pervaded the meeting, which also kicked off a new Hopkins center on antiterrorism, headed by former Hopkins public health dean and antismallpox crusader Donald “D. A.” Henderson. Attendees reviewed frightening scenarios—models in which U.S. cities were hit with anthrax or smallpox bombs—looking for weak points in the public health system. Speakers contended that hospitals and emergency services are woefully unprepared for a real emergency, such as a smallpox outbreak. And public health officials debated whether 20-year-old vaccine stocks would be adequate.

    The new initiative is meant to address these gaps with a national communication network, retraining for medical and emergency crews, and R&D on new drugs and vaccines. A federal coordinating committee headed by National Security Council staffer Richard Clarke is already in place at the White House, and programs are unfolding in several agencies. But even Zilinskas notes that discussions about the threat contain “some hype.” And a few skeptics—such as Israeli political scientist Ehud Sprinzak of Hebrew University—suggest the “craze” over biological threats is bad policy, noteworthy for making “a few defense contractors very rich.” It remains to be seen how long the policy will endure.

    Until recently, “bioterrorism wasn't on the agenda” for most public health researchers, says Peter Jahrling, chief scientist of the U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID) at Fort Detrick, Maryland. “They had important things to take care of, like Salmonella in the potato salad,” he says. So biodefense was largely assigned to military agencies. But attitudes have changed, partly as a result of high-profile revelations, such as the disclosure last year by defector Ken Alibek that the Soviets developed systems for loading smallpox into ballistic missiles. Such surprises have led to a “real culture shift” among public health officials and scientists, says Jahrling.

    Henderson is one of the prominent biomedical figures heading the campaign (see Review, p. 1279). Molecular biologist Joshua Lederberg, the president emeritus of Rockefeller University, is another. Winner of a Nobel Prize in 1958, Lederberg chaired an advisory committee that helped guide the Defense Advanced Research Projects Agency (DARPA) as it made its first investments in biological research 3 years ago. DARPA's spending on biodefense has grown rapidly since then and is projected to nearly double over the next year (see table).


    The HHS effort got a boost from an ad hoc “round table” of seven presidential advisors, chaired by Lederberg, which included genome researcher J. Craig Venter, head of Celera Genomics Inc. of Rockville, Maryland, and Thomas Monath, a vaccine expert and executive at OraVax Inc. of Cambridge, Massachusetts. At a meeting with Clinton on 10 April 1998, the group sketched out bioterror scenarios and answered Clinton's questions. Six weeks later, the president announced he was developing an antibioterrorism initiative, which would include a vaccine and medicine stockpile. Congress, already concerned about terrorist threats, passed an “emergency” appropriation last fall, sending more than $150 million in HHS's direction. The bill also earmarked $1 million for the civilian bioterror defense research center at Hopkins.

    Since December, the Administration has been working out the details of how it will spend all this new money. Most of HHS's funds will be passed through the Centers for Disease Control and Prevention (CDC) to local public health and disaster agencies. The goal, assistant HHS secretary for planning Margaret Hamburg said last week, is to build connections between emergency crews who first respond to a crisis and agencies like CDC that have expertise in exotic diseases. Most doctors today have never seen a smallpox or anthrax case, so the network will serve in part to educate physicians about recognizing and treating these diseases. It will also provide a secure communications link in a crisis.

    HHS is planning to spend $51 million on a stockpile of drugs and vaccines, says James LeDuc of CDC's National Center for Infectious Disease. The CDC's two top concerns are anthrax, a tough bacillus, and the variola virus that causes smallpox. Anthrax is treatable with antibiotics if detected quickly, but it's hard to spot an infection early, and it kills quickly. Anthrax usually is not communicated from person to person. Smallpox, on the other hand, is highly contagious, and would cut a devastating swath through unvaccinated urban populations.

    CDC has about 15 million doses of smallpox vaccine in its 20-year-old reserve, but because the rubber seals are deteriorating, about a quarter are suspect. This vaccine should be replaced, preferably with a new type, LeDuc says, produced with modern techniques. (The existing vaccine is neither sterile nor pure.) But establishing the safety and efficacy of a new vaccine could be difficult, because there are no smallpox patients to test it on. CDC is talking to the National Institutes of Health (NIH) and USAMRIID about developing animal models for testing a new formula.

    Researchers are also debating the wisdom of a plan to destroy the last stocks of smallpox virus, at CDC and at the Research Institute for Viral Preparations in Moscow. Henderson and some other researchers argue that the destruction, scheduled for June, would reduce the risk of a smallpox attack by keeping the virus out of terrorist hands. But some argue that stocks should be preserved to help develop new drugs and vaccines (Science, 19 November 1993, pp. 1223 and 1225).

    For anthrax, there are no civilian vaccine stocks at all: Supplies have been purchased by the Defense Department for the troops, and the sole factory that makes the vaccine is shut for renovation. CDC officials agree that it will be necessary to develop a new anthrax vaccine soon. USAMRIID has candidates in development, but bringing them through a series of clinical trials will be costly. A new version may be ready by 2005. In the meantime, CDC will store up antibiotics.

    Another chunk of money, about $25 million, will go to NIH for basic science supporting vaccine and drug research. The bulk of it will be channeled through NIH's National Institute of Allergy and Infectious Diseases (NIAID) to extramural grantees for genetic studies of pathogens (anthrax, smallpox, plague, and tularemia). Starting next year, says NIAID's Catherine Loughlin, “we'd like to take advantage of the genomic information to identify targets” for drug and vaccine development.

    One promising therapeutic development, according to NIAID staffer Bernard Moss, comes from an agency that isn't earmarked for a funding boost—USAMRIID. There, microbiologist John Huggins has been screening licensed antiviral drugs to find some that might help combat smallpox. Using a mouse model of smallpox he developed, Huggins found a good candidate: cidofovir, a drug used mostly by AIDS patients for cytomegalovirus eye infections. But it must be given intravenously, and it has strong side effects—problems that make it impractical for emergency use. Henderson, for one, sees no immediate application. But Huggins is collaborating with NIAID and CDC in a search for analog drugs, says Loughlin, although Huggins is “doing all the work.”

    While USAMRIID is pleased to have collaborators in its traditional line of research, long-time workers in the field wonder how long the enthusiasm will last. Mindful of the ephemeral quality of such policy initiatives, reporters asked Shalala last week at what point the government's antibioterror program would reach its objectives. Shalala shot back: “This is not a quick response. … I will never say we have done enough.”

    • * National Symposium on Medical and Public Health Response to Bioterrorism, 16 to 17 February, in Crystal City, Virginia, sponsored by Johns Hopkins University, HHS, the Infectious Diseases Society of America, and the American Society for Microbiology.


    Yucca Mountain Panel Says DOE Lacks Data

    1. Richard A. Kerr

    With just 2 years to go before deciding whether Yucca Mountain in southern Nevada should be a permanent home for spent fuel from the country's nuclear power plants, the U.S. Department of Energy (DOE) has run into another snag. In a report* submitted 2 weeks ago, a panel of experts says major questions about the controversial site are still unanswered and casts doubt on DOE's ability to make a final decision in 2001.

    Congress chose Yucca Mountain as the sole site to be studied as a high-level radioactive waste repository in 1987, and DOE has spent $6 billion toward reaching that goal. While waste piles up in 72 temporary facilities, political and legal battles have pushed back its original start-up date of 1998 to the current target of 2010. In December, the department announced that the latest study, an assessment of the remote mountain's ability to entomb the waste safely for thousands of years, had identified “no show stoppers.” Although safety questions remain, DOE officials said then, they were confident that the repository “would protect public health and the environment for thousands of years.”

    But on 11 February, a blue-ribbon panel of six experts hired to peer review the agency's study raised doubts about that conclusion in what the panel calls a “highly critical” report. “There's a lot to be done” before DOE can make such a prediction, says panel chair Chris Whipple, a risk assessment engineer at ICF Kaiser Engineers Inc. in Oakland, California. “Can they do it on their current schedule? That seems unlikely.”

    DOE is taking the panel's report in stride. “I think they overstated [the uncertainties] a bit, [but] it's what we paid for,” says Abe Van Luik, senior technical adviser for performance assessment in the Yucca Mountain Project. “We're taking them seriously.” DOE is still aiming for a decision in 2001, he says.

    The report faults the department's current model for predicting the repository's behavior, which takes into account everything affecting the movement of radioactive elements out of the fuel rods and into the distant environment over millennia. The panel agrees that DOE has done a good job assessing such possible disruptions as earthquakes, volcanic eruptions, and nuclear reactions suddenly taking off, but notes that other assumptions “may be unduly optimistic.” For example, the cladding that encases the enriched uranium rods and provides the first line of defense may not hold up as well as assumed. More lab work on the cladding's behavior under repository conditions is needed, says the report.

    The behavior of the radioactive material once it leaks out, as it eventually must, is also unclear, says the report. More exploratory holes should be drilled into aquifers far from Yucca Mountain, where the radioactivity will ultimately spread, it suggests. The panel is especially concerned about the assumptions behind the repository's “hot” design, in which heat from the waste is supposed to keep temperatures well above boiling and thus initially keep out moisture that could corrode the rods. “We don't think anybody can model that convincingly,” says Whipple. Such stubborn problems might be handled by making some conservative, simplifying assumptions, says Whipple, an approach DOE has yet to accept.

    Van Luik says he's “a tad surprised at the amount of material they think we need to do.” Some of the suggested work is already under way, he notes, and project staff are still debating the merits of a hot design. “This is not our final design, nor our final understanding of the site,” he explains. But he's concerned by the fact that “the panel recommends that we do additional work that would extend us beyond our current schedule.”

    Kevin Crowley, staff director of the National Research Council's Board on Radioactive Waste Management in Washington, D.C., says DOE would be wise to take the panel's advice because its current schedule is unrealistic. The panel's emphasis on gathering more data and dealing with the intractable complexities, he adds, could be key to resolving the technical issues. “The DOE has some real challenges ahead,” he warns.

    • * “Final Report, Total System Performance Assessment Peer Review Panel.” For a copy, see


    Stress Profiling Gets The Best Out of Glass

    1. Alexander Hellemans*
    1. Alexander Hellemans is a writer in Naples, Italy.

    Try to bend a piece of window glass, and you'll get a vivid demonstration of glass's brittle behavior. When stressed, it shatters without warning into thousands of shards. Now an international team of researchers has developed a clever way to make glass a little more pliant and predictable. On page 1295, materials scientist David Green of Pennsylvania State University in University Park and his team describe a chemical toughening process that resulted in glass that both resists fracture better and delivers a warning before it finally fails, in the form of small cracks on its surface.

    “Usually, when a piece of glass starts to break, that's the end of the story. In this glass you can arrest the cracks and you get some warning before the final failure,” says Green. “The fact that multiple cracking can be observed in glass is indeed remarkable,” agrees William Tredway, advanced ceramics group manager at United Technologies Research Center in East Hartford, Connecticut.

    The traditional method for making glass more resistant to fracture is called tempering. Manufacturers use either heat or chemicals to increase the “residual stress”—the compressive forces between atoms—at its surface. Before an external stress forms a crack, it must overcome not only the normal strength of the material, but also this extra residual stress. Tempered glass is more resistant to fracture, but when a crack does form at the surface, it quickly moves deeper, where the stress is lower, and the material fails catastrophically.

    In 1991, Green and Rajan Tandon, now at Caterpillar Inc. Technical Center, a construction machinery manufacturer in Peoria, Illinois, did theoretical studies that pointed to a better way to strengthen glass. The studies showed that a compressive stress “profile,” with relatively weak stress at the surface increasing to a maximum at a depth of 20 to 30 micrometers, would stop cracks because they would face increasing compressive stress as they moved deeper into the material. “The idea went against the current dogma of what you are supposed to do,” says Green. “Usually people try to get the maximum compression at the surface.”

    To create the required stress profile, the researchers developed a two-stage chemical tempering process. The main skeleton structure of glass is composed of silicon and oxygen atoms, interspersed with sodium atoms. The researchers immersed a glass sample in a bath of molten potassium nitrate at high temperature, allowing some of the potassium ions in the bath to swap places with sodium ions in the glass—a process called “ion exchange.” Potassium atoms have a radius that is 25% larger than that of sodium, says team member Vincenzo Sglavo of the University of Trento in Italy. “This causes compressive stress in the material.”

    This first stage of the procedure is much like traditional chemical tempering. But in a new twist, says Sglavo, the researchers then reversed the direction of ion exchange. They briefly immersed the glass in a mixture of molten sodium and potassium nitrate. Some of the potassium ions migrated from the glass back out into the bath. The result was glass with a very thin surface layer containing sodium atoms and deeper layers richer in the potassium deposited by the first treatment. “Right at the surface some of the compressive stress is released,” says Sglavo.

    When the researchers tested their glass samples under increasing loads, they found that the samples' strength was up to 5 times that of typical window glass. Sglavo reports that they could flex a 10-centimeter piece of glass by more than 1 centimeter in its center. “Usually you break it,” he says. And when they flexed it, they observed small cracks forming on the convex surface. “This is the indication of a critical condition like you see in plastic or in metals,” says Sglavo. Although glass tempered with such a stress profile would cost more than normal window glass, the researchers believe such a “safer” glass would be very valuable for certain purposes, such as car windscreens. The computer industry would also welcome thinner, stronger glass for lightweight displays, says Green.


    Chlamydia Protein Linked to Heart Disease

    1. Trisha Gura*
    1. Trisha Gura is a science writer in Cleveland, Ohio.

    Nature has its share of copycats, which rely on deceit to escape predators: insects that look like the sticks they walk on, frogs disguised as leaves, harmless butterflies that model themselves after their poisonous cousins. Even microbes disguise themselves with proteins that mirror those of their host as a way of evading detection by the immune system. But such molecular mimicry may harm the host as well as protect the microbe by causing the immune defenders to mistakenly turn on the body's own tissue. Over the past year, investigators have implicated molecular mimicry in an eye disease and chronic Lyme arthritis, and now in one of the most common serious illnesses: heart disease.

    On page 1335, a team led by immunologist Josef Penninger of the Ontario Cancer Institute and the Amgen Institute at the University of Toronto reports that the bacterial pathogen Chlamydia makes a peptide that mimics a portion of a heart muscle protein. In mice, the bacterial peptide can cause immune sentries known as T cells to attack the heart muscle, triggering a severe inflammation. If something similar occurs in human beings and the inflammation also plays a role in the formation of the artery-clogging plaques of atherosclerosis—two big ifs—the work may provide a molecular explanation for a long-suspected link between infections and heart disease.

    So far, the evidence for that link has been circumstantial: a stream of studies associating cardiovascular disease with infection by agents including Chlamydia (Science, 3 July 1998, p. 35), plus a report in the 3 February 1999 issue of the Journal of the American Medical Association that antibiotic use reduces the risk of heart attack. But researchers have had little idea about how infections might lead to heart problems. The new study, says epidemiologist Hershel Jick at Boston University, who co-authored the JAMA report, could “give us an important piece of the puzzle in the story of infection and heart disease.”

    Penninger and Kurt Bachmaier, a postdoc in his lab, had previously shown that injecting a fragment of the heart muscle protein myosin into mice causes severe inflammatory responses in the animals' hearts. To Penninger, this suggested that the immune system was mistaking the heart peptide for something foreign—perhaps a peptide in a microbe to which the mice had previously been exposed. To come up with a likely suspect, the team plugged the sequence for the offending peptide into sequence databases.

    The researchers expected to find a related peptide in something like the Coxsackie B3 virus, long known for infecting heart muscle. But to their surprise, the sequence of the myosin fragment closely matched that of a peptide found in three strains of C. trachomatis, the culprit in sexually transmitted diseases. They also found similar, but not identical, peptides in C. pneumoniae and C. psittaci, known to cause respiratory infections.

    The researchers soon confirmed that injecting the Chlamydia peptides, together with an immune booster called Freund's adjuvant or with the microbe's own DNA, into mice provokes heart inflammation, caused by T cells infiltrating the heart muscle. The vigilant immune cells, when activated in a mouse injected with the Chlamydia proteins and then transferred to another mouse, could also cause the same heart-destroying response in that animal.

    But perhaps the most crucial evidence of all was the finding that live bacteria pumped by catheter and syringe into the noses or genital tracts of mice caused subsequent heart inflammation. The researchers also found that the mice made antibodies to the bacterial and heart peptides, and also to a third peptide from another heart protein—a telltale sign of an overzealous immune response, the researchers say. “We have proven that a bacterial infection in the genital tract or lungs can lead to cardiac inflammation,” Penninger says. “Our paper takes this out of the realm of epidemiology and really says this is a causal link of how Chlamydia could work to cause heart disease.” It also implies that preventive treatment with antibiotics might thwart some cases.

    Other researchers are cautious, however. Cardiologist Brent Muhlestein at the University of Utah in Salt Lake City and others note that the inflammation resulting from the mimicry seems confined to the heart muscle itself rather than extending to the arteries, where it could trigger the plaques characteristic of atherosclerosis. In addition, C. trachomatis, which evokes the strongest response in Penninger's study, hasn't yet turned up in atherosclerotic plaques.

    And even if Chlamydia infections are involved in human heart disease, researchers will want to know why many people escape the problem, even though Chlamydia infections are very common. One possibility, of course, is that such well-established risk factors as smoking and high blood cholesterol concentrations also influence how the body responds to Chlamydia.

    The only way to tell if the microbe is triggering heart disease through molecular mimicry, Penninger says, is to do epidemiological studies to see if people who have antibodies against the bacterial peptide have a higher rate of the disease. Boston University's Jick agrees. “One of the obvious limitations is that, so far, the effect has been shown only to occur in mice,” he says.


    Fruit Fly Odor Receptors Found

    1. Elizabeth Pennisi

    Although researchers identified the receptors mammals use to detect odors almost a decade ago, they've been unable to sniff out those of any insect. Now, the impasse has been broken. Two teams, one led by John Carlson of Yale University and the other by Richard Axel of Columbia University, have independently discovered the first odor receptors in the fruit fly, Drosophila melanogaster.

    The work, described in the February Neuron by Carlson and his colleagues and in the 5 March issue of Cell by Axel and his, has so far pulled out a total of 17 genes encoding Drosophila odor receptors. Given that these came out of the 15% of the Drosophila genome that has been sequenced, the insect may have 100 to 200 odor receptor genes in all.
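The "100 to 200" figure is a straightforward extrapolation; a back-of-envelope version, assuming (purely for illustration) that receptor genes are spread evenly across the genome:

```python
# Back-of-envelope extrapolation behind the 100-to-200-gene estimate.
# Assumes, for illustration only, that odor receptor genes are
# distributed evenly across the Drosophila genome.
genes_found = 17           # receptor genes identified so far
fraction_sequenced = 0.15  # share of the genome sequenced at the time

estimate = genes_found / fraction_sequenced
print(round(estimate))  # prints 113, inside the quoted range
```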

    Their discovery will be a boon to neurobiologists, who hope to use the information to probe the more complex workings of mammal brains. By systematically knocking out the fly genes and observing the effects on odor sensitivity and behavior, researchers should be able to piece together a wiring diagram of the olfactory system of the fruit fly. “One can expect in the next few years that a lot will be discovered, providing important new insights into olfaction and probably into sensory coding,” predicts Harvard University neurobiologist Catherine Dulac.

    The first payoff, however, may be explaining how other insects behave. Already, researchers are using the sequences of the newfound Drosophila genes to track down odor receptors in insects that damage crops or transmit human diseases. Having these receptors in hand will make it much easier to find specific compounds that interfere with the insects' ability to detect odors. Because insects depend on smell to find mates and food, such substances could “really enhance our ability to control insect pests,” notes Tim McClintock, a neurobiologist at the University of Kentucky College of Medicine, Lexington.

    The key to success for both the Yale and Columbia groups was finding the first olfactory receptor gene. For years, others had tried to find these genes by looking for fruit fly genes whose sequences resemble those of known mammalian odor receptor genes. But those searches all came up empty. “These guys came up with a better way,” says neurobiologist Dean Smith of the University of Texas Southwestern Medical Center, Dallas. They used a new method to search a growing fly DNA data set: the sequences accumulated by the Berkeley Drosophila Genome Project.

    Aware that the odor receptor proteins would have to be embedded in the membranes of olfactory nerve endings, Yale's Peter Clyne, Junhyong Kim, and Coral Warr first looked for DNA sequences in the Berkeley data that might encode transmembrane domains, strings of hydrophobic amino acids that can tolerate insertion into fatty membranes. They then eliminated the nonsense DNA and the known genes. This got them down to 34 candidates. Two turned out to be the elusive odor receptor genes, as evidenced by their expression in the olfactory neurons.
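In schematic form, such a screen slides a window along each candidate protein and flags stretches hydrophobic enough to span a lipid membrane. The sketch below uses the standard Kyte-Doolittle hydropathy scale; the window size and threshold are illustrative assumptions, not the Yale group's actual parameters.

```python
# A minimal sketch of a transmembrane-domain screen of the kind described
# above. The window size and threshold are illustrative assumptions.

# Kyte-Doolittle hydropathy values for the 20 amino acids
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5,
      'Q': -3.5, 'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5,
      'L': 3.8, 'K': -3.9, 'M': 1.9, 'F': 2.8, 'P': -1.6,
      'S': -0.8, 'T': -0.7, 'W': -0.9, 'Y': -1.3, 'V': 4.2}

def has_transmembrane_stretch(seq, window=19, threshold=1.6):
    """Return True if any window-long stretch of the sequence is
    hydrophobic enough to plausibly span a lipid membrane."""
    for i in range(len(seq) - window + 1):
        mean = sum(KD[aa] for aa in seq[i:i + window]) / window
        if mean >= threshold:
            return True
    return False

# A run of leucines, isoleucines, and valines passes; a charged stretch does not.
print(has_transmembrane_stretch('LLLILLVVALLILLVVALLIL'))  # True
print(has_transmembrane_stretch('KKDDEERRKKDDEERRKKDDE'))  # False
```

Candidates passing such a filter would still need the follow-up steps the teams describe: discarding known genes and confirming expression in olfactory neurons.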

    At Columbia, Leslie Vosshall and her colleagues found their first Drosophila odor receptor gene by searching for genes that are active only in the fruit fly olfactory organs: the antennae and a rod-shaped projection on the head called the maxillary palp. The researchers did this by comparing messenger RNAs (mRNAs), which indicate active genes, from the olfactory organs with mRNAs from the whole body and the head. Vosshall gradually homed in on a small set of genes, which she then sequenced and tested to see whether they are active only in the olfactory sensory nerve cells. She found one such gene and, like the Carlson team, used it to find related genes in the sequence database.
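The subtractive logic of that comparison reduces to a set difference: keep only the genes whose transcripts appear in the olfactory organs but not elsewhere. The gene names below are made up for illustration.

```python
# Schematic of the Columbia group's subtractive screen: genes expressed
# in the antennae but not in the rest of the body are candidates.
# All gene names here are hypothetical placeholders.
antennal_mrnas = {'or1', 'or2', 'actin', 'tubulin'}
body_mrnas = {'actin', 'tubulin', 'myosin'}

olfactory_specific = antennal_mrnas - body_mrnas
print(sorted(olfactory_specific))  # prints ['or1', 'or2']
```

In practice the comparison was done with mRNA populations rather than gene lists, so candidates still had to be sequenced and their expression confirmed, as the paragraph above describes.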

    Both groups now have clues about how the fruit fly brain perceives odors. They've shown that the genes are expressed differently in the various olfactory nerve cells. These data suggest that fruit flies, like vertebrates, may discriminate odors by decoding patterns of nerve activation that reflect the responses of many individual cells, each attuned to a single sensation.

    Carlson's team also learned something from flies with a damaged sense of smell. In a separate study, they found a defective gene in the flies that codes for a protein that regulates gene expression. The defect appears to turn off certain receptor genes in some olfactory nerve cells. “The fact that some receptors are gone is pretty cool,” Smith says, as it suggests this DNA regulatory protein helps set up the pattern of gene activity—and odor sensitivity—in the fly. In addition, as in vertebrates, Carlson notes, various fly odor receptor genes appear to be active at different times during development and may help organize the olfactory system.

    He and his team plan to continue to look for more odor receptor genes and try to understand how these genes are regulated. “I feel like a kid in a candy store,” Carlson says. “There's a million things we can now do.”


    Kennewick Man Gets His Day in the Lab

    1. Constance Holden

    More than 30 months after a 9000-year-old skeleton was found on the banks of Washington's Columbia River, a government-appointed team of scientists has begun an examination to decide, once and for all, whether Kennewick Man qualifies as a Native American. Scientists are happy that the skeleton has made it into the lab, but they are worried that the government could put a crimp on the way the work is done and reported.

    On 17 February, the Interior Department announced that five scientists have been appointed to help Frank McManamon, chief archaeologist of the National Park Service, perform a systematic analysis of Kennewick's 300-plus bone fragments. The work is being done at Seattle's Burke Museum, where the bones have resided, per court order, since last October.

    The skeleton already has a colorful history (Science, 10 April 1998, p. 190). Discovered in July 1996, the bones were seized by the Corps of Engineers on behalf of Native Americans under a 1990 law, the Native American Graves Protection and Repatriation Act. A group of scientists promptly sued for the right to study the remains, which have been claimed by a coalition of five tribes as well as a group representing ancient Nordics. A Portland court order kept them above ground but in limbo at Pacific Northwest National Laboratory in Richland, Washington, where tribal groups visited them occasionally for ritual purposes.

    The Corps has turned decision-making over to the Interior Department, and last spring a federal judge gave the department permission to proceed to examine evidence bearing on Kennewick Man's identity. Now he's finally ready for a full inspection. The first step involves scrutiny of bones, teeth, soil samples, and a rock spearpoint embedded in the pelvis to establish whether the man fits the law's definition of a Native American, which includes the words “of, or relating to, a tribe, people or culture that is indigenous to the United States.” McManamon says the term applies to anyone found to be in an area before the Europeans got there, and it's not necessary to find “a biological link between modern tribes and ancient remains.”

    If the findings are inconclusive, McManamon says scientists will check with Native Americans, with whom they have been consulting continuously, before applying invasive procedures such as radiocarbon dating of bones and attempts at DNA analysis. If Kennewick is deemed to be a Native American, says McManamon, scientists will then explore the archaeological record and local tribal histories to establish whether he has “cultural affiliation” with any modern tribe.

    Scientific observers are gratified that scientists are finally able to get at the bones, but they're upset with Interior's procedures and its definition of Native American. The term, by that interpretation of the law, encompasses “Norse remains in Maine … and a lot of Japanese shipwrecks,” complains anthropologist Robson Bonnichsen of Oregon State University in Corvallis, one of the plaintiffs in the court case. He and others contend that the law requires a relationship to present-day populations. Bonnichsen is also dubious that the government will find any connection with modern-day Indians given that the only artifact is a spearpoint, and the skull—which looks like a cross between a Polynesian and a member of Japan's Ainu people—is quite different from those of today's Native Americans.

    Interior spokesperson Stephanie Hanna says a preliminary report from the expert team is due in mid-March and data will eventually be made public. The remains will ultimately be given to the tribe in question if a cultural affiliation is established. But the scientists' suit is only “on hold,” says plaintiffs' lawyer, Alan Schneider, pending the outcome of the current exercise.


    Culture Collections Seek Global Help

    1. Dennis Normile

    TOKYO—The explosion of interest in biodiversity has generated a wave of popular support for preserving and cataloging the world's plant and animal species. But their less flashy brethren—organisms such as yeasts, bacteria, fungi, and cell lines—haven't been able to capitalize on that interest. “The collections are taken for granted,” laments Jennie Hunter-Cevera, a microbiologist at Lawrence Berkeley National Laboratory in California. “What you cannot see, you don't value.”

    Help may be on the way, however. The Organization for Economic Cooperation and Development (OECD) is expected to launch a major study later this year that could lead to an international agreement on preserving these biological materials. Scientists and government officials who gathered here last week for two meetings* asked for OECD's help, and the idea has been embraced by officials at the Paris-based grouping of the world's 29 leading industrialized democracies. “There is a growing awareness among policy-makers beyond the scientific community that biological resource centers are vital … for future research in the life sciences and biotechnology,” says Solomon Wald, who heads the OECD's working group on biotechnology. Adds Alan Doyle, a microbiologist with the Wellcome Trust in London: “It takes some wider group with enough influence to get policy changes, and the OECD seems to be the right vehicle for this.”

    The working group on biotechnology is expected to adopt the idea at its spring meeting. It would then probably form a task force to study six aspects of managing such collections: access and distribution; quality assurance; efficiency and the avoidance of duplication; funding and sustainability; education, training, and research; and networking. Wald says the task force, with experts from interested countries, will develop policy principles that could form the basis for an international agreement to promote the health of the biological resource centers. The study is expected to take a year or more.

    Researchers say that such help is badly needed because efforts by the scientific community have fallen short. Many present-day collections trace their roots to a dedicated individual or group that collected specimens for academic interest, or to companies that collected materials for commercial purposes. Brewers, for example, often started collections of yeasts. But regardless of their origin, most of the major collections are now under the wing of universities or public institutions. Perennially short of money, they face major challenges in keeping up with the times.

    One of the most pressing needs is to improve, or in many cases to establish, the capacity to manage the rapid accumulation of genetic information about their holdings. “The centers are being crushed by the volume of new strains and the avalanche of genetic information,” says Hideaki Sugawara, an informatics specialist at Japan's National Institute of Genetics in Mishima. Adds Raymond Cypess, president of the American Type Culture Collection (ATCC) in Manassas, Virginia, which holds over 78,000 material samples, “[the centers] are really information resources.”

    Accordingly, meeting participants strongly backed the creation of what Wellcome Trust's Doyle calls “a virtual biological resource center.” The idea is to put online all the genomic and functional information on the holdings of individual collections, with search capabilities for researchers worldwide. But there are daunting obstacles, chief among them the many different database and classification standards already in use.

    Aside from informatics, curators are also struggling to find the right balance between unnecessary duplication of holdings and the need to keep key strains at each center for logistical and strategic reasons. There is also debate over whether a self-sustaining not-for-profit corporation, such as the ATCC, or a publicly funded institution, such as is common in Europe, offers a better model for the long-term viability of a collection.

    Participants don't expect the OECD task force to come up with all the answers. But Doyle says he'll be satisfied if the effort contributes to “the future stability of culture collections.”

    • * OECD Workshop on Scientific and Technological Infrastructure, Tokyo, 17 to 18 February, cosponsored by Japan's Ministry of International Trade and Industry; Microbial Resources Centers in the 21st Century: New Paradigms, Tokyo, 16 February, sponsored by World Federation of Culture Collections.


    Health Research Gets Fundamental Overhaul

    1. Wayne Kondro

    OTTAWA—It's rare for the head of a major government research organization to applaud politicians for abolishing his agency. But for Henry Friesen, president of the Medical Research Council (MRC) of Canada, last week's announcement that the MRC would be replaced in a year's time by a new Canadian Institutes of Health Research (CIHR) marks a major step forward in his effort to fundamentally change the nature of Canadian biomedical and health research.

    Friesen first proposed CIHR as a way to create a national network of “virtual” research institutes. His hope was that the concept might persuade the government to pump more money into health research (Science, 8 May 1998, p. 821). Now the idea has emerged as a key element in a “health and welfare budget” that Finance Minister Paul Martin unveiled last week. That budget, for the fiscal year beginning 1 April, takes advantage of a projected surplus to commit $325 million more for a grab bag of research initiatives that includes bolstering Canada's space program and expanding programs to renovate aging academic labs and foster collaboration with industry.

    An elated university community is praising the CIHR initiative, which would build on work at the country's 16 academic health centers. Like the MRC, the CIHR will be in the business of issuing extramural research grants. But the science it supports will encompass more health services and population-based research than did its predecessor, which focused primarily on biomedical research.


    Through external advisory boards, Canadians will have a greater say in determining the type of projects to be supported, says Association of Universities and Colleges of Canada President Robert Giroux. “It's also coordinating and maximizing what's being done in all areas,” he adds. Canadian Medical Association President-elect Hugh Scully sees CIHR as a shot in the arm for the entire Canadian health care system, and University of British Columbia President Martha Piper notes that “being able to network our brightest minds across many labs and institutions is really quite strategic.”

    But while the expectations for CIHR may be great, the initial funding is relatively modest and falls well short of the $325 million-a-year boost in health research funding that proponents had requested during a year-long campaign. When the CIHR opens its doors next year, it will receive $219 million, a $39 million supplement to MRC's base budget. An additional $72 million would come in fiscal year 2001–02. Friesen, who is in line to head the new institutes, says that the government is telling researchers “to walk before we give you sufficient funds to run at top speed.”

    Indeed, the next 12 months will be anything but a stroll in the park for CIHR. A task force appointed by Health Minister Allan Rock and headed by Friesen will debate how the organization will be structured, where institutes will be based, and what they will concentrate on. It will also decide whether to roll under the CIHR umbrella roughly $70 million a year in health research now being conducted by the Natural Sciences and Engineering Research Council (NSERC) and the Social Sciences and Humanities Research Council. Federal officials anticipate a year of “immense” and “intense” negotiations.

    University administrators also give thanks for a $130 million boost to the $520 million Canada Foundation for Innovation, which this spring expects to award its first major grants for projects aimed at rejuvenating an aging research infrastructure (Science, 28 February 1997, p. 1256). CFI President David Strangway says he hopes the new money will generate “more imaginative” applications. The government also gave the Networks of Centers-of-Excellence program a 65% raise, to $50 million. NSERC President Thomas Brzustowski said the monies will allow for as many as eight new centers linking researchers across campuses in joint projects with industry.

    NSERC itself received an unexpected $16 million increase in its $305 million budget. The new budget also includes $156 million more over 3 years for the Canadian Space Agency, which had threatened to withdraw from long-term participation in the international space station. The government has promised to stabilize its budget, now $220 million but falling rapidly as several projects conclude, at $195 million.

    Ottawa was less responsive to a proposed 5-year, $175 million national genomics initiative (Science, 3 July 1998, p. 20), which finance officials nixed after deciding that it lacked “maturity.” And it gave the National Research Council only $3 million of its $16 million-a-year request to recover from 3 years of budget cuts (Science, 18 September 1998, p. 1781).


    First Food-Borne Pathogen Sequenced

    1. Elizabeth Pennisi

    Nature lovers used to marvel over avian ingenuity when they saw a bird peck its way through a foil bottle cap for a sip of milk. The clever bird sometimes left an unwelcome present in return: Campylobacter jejuni, a bacterium that causes severe gastrointestinal upsets in humans. Milk rarely comes in bottles these days, but Campylobacter has become a major health problem over the past 20 years, often passing from its natural avian hosts to humans through undercooked poultry or contaminated water. Now, C. jejuni has a new—and more auspicious—claim to fame: It's the first food-borne pathogen whose genome has been sequenced.

    At the Microbial Genomes III meeting held earlier this month in Chantilly, Virginia, microbiologist Brendan Wren of St. Bartholomew's Hospital in London reported that a team led by Bart Barrell and Julian Parkhill at the Sanger Centre in Cambridge, U.K., has determined the exact order of the 1.64 million bases that make up the pathogen's genetic code. The sequence has already revealed how C. jejuni might evade immune system detection, information that might help researchers develop vaccines to protect against the bacterium, which last year caused nearly 300,000 cases of food poisoning in the United States alone.

    It is also shedding light on an occasional aftermath of C. jejuni infections: a temporarily paralyzing neuromuscular disorder called Guillain-Barré syndrome, thought to be an autoimmune reaction touched off by the bacterium. What's more, because C. jejuni is a close relative of Helicobacter pylori, which causes ulcers, comparing the two genomes should help researchers better understand that pathogen as well. “The Campylobacter sequence is going to help the field no end,” predicts Richard Alm, a microbiologist at Astra Research Center in Boston, Massachusetts.

    Microbiologists have found C. jejuni difficult to study because it grows poorly in the lab. But sequencing it proved much less of a problem, taking less than 16 weeks from start to finish. “It sequenced like a dream,” Wren said, opening up an entirely new view of the organism.

    Not all of C. jejuni's potential genes have been identified yet, but those that have may solve some puzzles about the organism. For example, the Sanger group discovered repeated sequences of either guanine or cytosine bases—anywhere from 7 to 13 copies of each—in 25 of the microbe's genes. Such repeats are not unusual, but in this case they helped Wren and his colleagues see how the bacteria might evolve to evade the host immune system.

    These repeated sequences are particularly prone to mutation when the bacteria replicate their DNA before dividing, because in those regions the strand being synthesized may slip relative to the one being copied, with the result that bases are lost or gained depending on the direction of the slippage. And Wren and his colleagues found that the same gene could contain, say, nine guanines in a row in one sample and 13 in another, changes that could affect the structure and activity of the gene's protein product.
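    Finding such tracts computationally is simple. Below is a toy sketch (not the Sanger Centre's actual analysis code) that scans a DNA sequence for runs of seven or more guanines or cytosines, the homopolymer tracts most prone to slipped-strand mispairing:

```python
import re

def find_homopolymer_tracts(seq, bases="GC", min_len=7):
    """Return (start, base, length) for every run of a single base at
    least min_len long: the sort of tract prone to slipped-strand
    mispairing when the DNA is copied."""
    tracts = []
    for base in bases:
        # e.g. the pattern "G{7,}" matches 7 or more consecutive G's
        for m in re.finditer(f"{base}{{{min_len},}}", seq.upper()):
            tracts.append((m.start(), base, len(m.group())))
    return sorted(tracts)

# A toy stretch of sequence containing a 9-G tract and an 8-C tract.
toy = "ATGCCATGGGGGGGGGTTACCCCCCCCA"
print(find_homopolymer_tracts(toy))  # → [(7, 'G', 9), (19, 'C', 8)]
```

    Real annotation of a 1.64-megabase genome would layer gene coordinates onto this, but the core pattern match is no more complicated than the regular expression above.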

    These mutations primarily affect genes that help produce lipopolysaccharides, the sugars that coat the surface of C. jejuni. By frequently altering these genes, C. jejuni may change how its surface looks to the immune system and may thus avoid recognition by antibodies made during previous infections, suggests Julian Ketley, a microbiologist at Leicester University, U.K. The sequence also revealed what may be another countermeasure in C. jejuni: three not-quite-identical copies of a gene called NeuB. These genes, which make proteins that help cause acidic sugars to be added to various other molecules in a process called sialylation, might help disguise bacterial components so that they look more like those of the host and are thus harder for the immune system to detect.

    Besides shielding the bacterium from an immune response, these similarities could cause trouble when the immune system does succeed in recognizing the camouflaged molecules. Wren reported preliminary experiments suggesting that one NeuB gene may cause a surface molecule on C. jejuni to look like a ganglioside, a type of lipid found in high concentrations in the nervous system. That close resemblance could trick the immune system into attacking nervous tissue as well as the invading bacteria, perhaps causing Guillain-Barré syndrome.

    So far, however, the genome hasn't provided many other clues about how the microbe does its dirty work. For example, researchers thought a toxin similar to that made by the cholera pathogen might be the cause of the diarrhea and other symptoms caused by C. jejuni. “But there doesn't seem to be any evidence of that,” says Ketley, who has teamed up with several other U.K. researchers to identify all of C. jejuni's proteins and their functions as a way of pinning down the source of its virulence.

    In the meantime, Wren and others have begun comparing the C. jejuni sequence with that of the closely related H. pylori. Until now, “no one had sequenced different, side-by-side species,” Alm notes. The differences are surprisingly large, he adds: “The genomes are indistinguishable by size and yet 17% of the genes are specific to Helicobacter.”

    Some of the differences appear to be related to the different lifestyles of the two organisms. For example, H. pylori settles only in the stomach and has several genes that appear to help it cope with the stomach's acid environment by coding for enzymes that break down urea. This “may create an alkaline cloud around the Helicobacter,” Wren explains. C. jejuni, for its part, has about twice as many genes as H. pylori that are involved in sensing and initiating coordinated responses involving multiple genes. These presumably enable the microbe to adjust to a new environment, be it the gut of a bird, milk in a bottle, or a human intestine.

    Over the next few years, Wren and his colleagues will study the role of the newly identified genes by using DNA microarrays, glass slides spotted with DNAs representing all of C. jejuni's genes, to see which genes are active over the course of an infection. In the near future, Wren predicts, “[C. jejuni] will go from being one of the least well-studied pathogens to one of the most well-studied ones.”

    Death by the Numbers

    1. David Kestenbaum

    Wall Street's math wizards got crushed last summer when a global panic broke out. Are they playing a zero-sum game, or will they rise to power again?

    A physicist-turned-Wall Street trader is on the phone. It's late at night, and he's at home. Telephone lines at his office are tape recorded, principally to document trades, but it has a chilling effect that persists after he gets home. “This has to be anonymous,” he begins. A source in New York agrees to meet for a hushed conversation over breakfast but later warns, “If you attach my name to any of this, I will make it my life's mission to hunt you down.”

    Welcome to the clandestine world of math and money, known as arbitrage. Strictly speaking, an arbitrage trade is one that earns money without risk or effort—if gold is selling for two different prices, for instance, you can buy low, sell high, and pocket the difference. A more modern definition would cover any trade that tries to profit from small market anomalies. Unlike most people, whose fortunes rise and fall with stock or bond prices, arbitrageurs ride the internal dynamics of the market, with trades artfully constructed so that, in principle, they make money whichever way the Dow goes. “We laugh at those guys who bet on which direction the market will move,” one says.

    Sometimes arbitrageurs make speculative bets, wagering, perhaps, that two companies will merge and their stock prices converge. But more often, the bets are highly mathematical, based on sophisticated computer programs that sniff for prices that are minutely out of joint. When one is found, traders borrow huge amounts of money and bet that the price will come back into line. As David Shaw of D. E. Shaw & Co., a well-known arbitrageur and former Columbia University computer scientist, once put it, arbitrage is “the search for 4.8 cent nickels.”

    Arbitrage may sound like a high-tech get-rich scheme, but it's practiced on huge scales by major investment banks and some “hedge funds,” unregulated shops that try to turn large profits for wealthy investors. Hundreds of quantitative economists, mathematicians, and former physicists have been putting their skills to work in these places, picking apart the markets with the same tools used to model heat flow in nuclear reactors or solve abstract mathematical puzzles. The arbitrage culture is secretive: If the models leaked out, there would be too much cash chasing too few opportunities and mispricings could dry up.

    The strategy seemed foolproof—until last summer's Russian financial crisis. On 17 August, Russia defaulted on its debt and triggered a global panic. Investors sold whatever they were holding and bought the safest things around, typically U.S. treasuries and German bonds. In the “flight to quality,” tiny price anomalies flew wildly out of whack. “All of the classical relations that you rely on went out the window,” says Leslie Rahl, an analyst with Capital Market Risk Advisors, a consulting firm in New York.

    The most widely publicized casualty was Long-Term Capital Management (LTCM), an elite hedge fund run out of Greenwich, Connecticut, that until then had seemed as solid as a bank vault, and one with a magical ability to breed money. If you'd invested with this company back in March of 1994, your money would have nearly tripled in value by the end of 1997. The firm was founded by trading legend John Meriwether and backed by the brains of two Nobel Prize winners and numerous finance Ph.D.s. “It was almost like that western The Magnificent Seven. The dream team was assembled, and they were just going to ride over the opposition,” says Nicholas Dunbar, an editor at RISK magazine in London, which publishes mathematical finance papers.

    By the end of the financial hurricane, however, LTCM had lost billions, and hemorrhaged 90% of its assets. Then word leaked out about problems elsewhere. In October, BankAmerica Corp. announced that it had lost $372 million on an investment with D. E. Shaw & Co. Some investment banks also took hits. “I happen to know a lot of guys who got fired,” says one trader.

    Many outsiders were quick to blame the catastrophe on mathematical hubris. Federal Reserve chair Alan Greenspan soberly remarked to a congressional committee, “No matter how skillful the trading scheme, over the long haul, abnormal returns are sustained only through abnormal exposure to risk.” Business Week magazine's September cover read simply: “MISFIRE: Wall Street's rocket scientists thought they had a surefire way to beat the markets. Boy, were they wrong!”

    But many traders interviewed by Science say the mathematical approach is a sound way to spin the market's inefficiencies into money, although it may need tuning as the marketplace becomes global. Some academics agree that the game is fundamentally profitable. “What this has demonstrated is that they need better models,” says Andrew Lo, director of the Laboratory for Financial Engineering at the Massachusetts Institute of Technology (MIT). Indeed, the math wizards are already reemerging. The shakeup may even be good news, Lo says: “It just creates more opportunities in the market.”

    Fishing for loopholes

    In a well-oiled market, most arbitrage opportunities are fleeting and hard to spot. Not always, though: A few years ago, the Italian post office inadvertently opened up a hole for arbitrageurs when it issued bonds that were substantially cheaper than equally secure bonds backed by the Italian treasury. Traders pounced. They realized they could buy the postal bonds, set up a skeleton off-shore company whose only assets were the bonds, sell shares of the company for about the price of a treasury-backed bond, and pocket the difference.

    According to an insider, representatives from major investment banks arrived at the Milan post office with cashier's checks for billions of dollars (one even had armored trucks filled with cash as a backup) to buy the bonds. Such simple arbitrage openings are rare, and they quickly evaporate once they're found. There are, however, more subtle opportunities. Many of them lie buried beneath the surface, in the prices of things called “derivatives.”

    A derivative is a financial instrument whose value depends on (is “derived” from) something else, often the price of a stock or bond. Simple examples are “call options” or “put options,” which are contracts giving the owner the right to buy or sell something at a fixed price at a future time. An investor might buy a put option giving her the right to sell a share of IBM for $100 on 1 August (the “expiration date”) as a kind of insurance policy. That way, if IBM's stock plummets, she can exercise the option and dump the stock at a reasonable price.

    Options sound simple, but for years no one could figure out how much they should cost. If IBM ends up above $100, the put option is worthless. If IBM ends up at $90, the option is worth $10. But what's it worth a month before the expiration date when IBM is $95 a share? Tough question. Figuring out how to price an option was the Gordian knot of finance.

    The breakthrough came in the early 1970s when economists Fischer Black and Myron Scholes discovered a clever link between options, stocks, and interest rates. The idea was this: Imagine buying a hypothetical put option that expires almost immediately. If you also buy just the right amount of the underlying stock, you are immune to what happens to the market in the next breath. That's because the combination is “risk-free”—if the stock falls a bit in value, the option is worth more, and vice versa. Since this hypothetical portfolio is risk-free, Black and Scholes reasoned, it should earn as much as an (essentially risk-free) treasury bond backed by the government. That meant you could take treasury bond interest rates, the current stock price, and calculate what this short-lived option should cost. That was half the eureka.

    Pricing a real option, whose expiration date could be weeks or months away, required one more thing: a model of how stock prices move over time. Black and Scholes assumed the prices display what physicists call “Brownian motion.” Like a raindrop falling in the wind, a stock price has a drift but also wiggles around. The wiggle, called the “volatility,” can be estimated from historical data. Mix all this up with a little stochastic calculus and you have the Black-Scholes Equation—at the time the fanciest bit of math ever to make it into a trader's calculator. It wasn't perfect, but it gave investors a way to price options and the confidence to trade them in huge quantities.
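    The closed-form result of that reasoning fits in a few lines of code. The sketch below prices the earlier hypothetical IBM put (stock at $95, strike at $100, one month to expiry); the 5% interest rate and 30% volatility are invented for illustration, since the article specifies neither:

```python
from math import exp, log, sqrt
from statistics import NormalDist

def black_scholes_put(S, K, T, r, sigma):
    """Black-Scholes price of a European put.
    S: spot price, K: strike, T: years to expiry,
    r: risk-free rate, sigma: annualized volatility."""
    N = NormalDist().cdf
    d1 = (log(S / K) + (r + sigma ** 2 / 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return K * exp(-r * T) * N(-d2) - S * N(-d1)

# The article's hypothetical: IBM at $95, a $100 put, one month out.
# The 5% rate and 30% volatility are assumed values, not from the text.
price = black_scholes_put(S=95, K=100, T=1 / 12, r=0.05, sigma=0.30)
print(f"put value: ${price:.2f}")
```

    A handy sanity check on the formula: as the assumed volatility shrinks toward zero, the price collapses to the discounted intrinsic value of the option.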

    Today, investors often buy options on stocks, interest rates, and commodities as insurance policies, or as cheap bets on which way the market will move. The pricing breakthrough won Scholes and Harvard University economist Robert Merton, who extended the work, the Nobel Prize for economics in 1997. Both became central figures at LTCM.

    With the new math came new opportunities to make money. Some firms started to specialize in pricing derivatives, making improvements to the Black-Scholes Equation and combing the market for derivatives that were selling for too much or too little. “This was real serious mathematics,” recalls David Weinberger, a former Bell Labs mathematician who was the managing partner for the Chicago firm O'Connor & Associates in the 1980s. “We had a quantitative research group of about 40 people, 20 to 25 of whom had Ph.D.s in physics, electrical engineering, or math,” says Weinberger. On a big trading day, O'Connor's maneuverings accounted for an enormous 5% of the volume on the New York Stock Exchange. “We made a lot of money,” Weinberger says. The race for new ways to turn math into money was on.

    All in the math

    One opportunity arbitrageurs spotted lay in an offspring of the Black-Scholes Equation: volatility. Because volatility is folded into the price of an option, it can be traded like corn or anything else. And although the actual price of the stock might wander randomly, volatility—the range of its daily excursions—seemed mathematical and maybe even predictable.

    To make predictions about a stock's volatility, arbitrageurs study how it has changed in the past. A pharmaceutical stock may have periods of wild speculation when a new drug goes into clinical trials, but quickly revert to more quiet meandering. This history is fed into models with names such as GARCH, or Generalized Autoregressive Conditional Heteroskedasticity. “Don't ask,” says one expert. “It's pretty hairy.” Modelers also throw in a variety of other factors to try to narrow the forecast, and they compare volatilities of similar stocks to see if one seems out of sync. Traders don't buy blindly, however. “You have to be a little concerned when things appear out of whack,” Weinberger warns. “You have to say ‘Why is this game available to be played?’”
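    For all the hairy name, the recursion at GARCH's core is one line: tomorrow's variance is a constant plus a weighted share of today's squared return (the shock) and today's variance (the persistence). A minimal sketch of GARCH(1,1), with returns and coefficients invented purely for illustration:

```python
def garch11_variance(returns, omega, alpha, beta, v0):
    """Iterate the GARCH(1,1) recursion
        v[t+1] = omega + alpha * r[t]**2 + beta * v[t]
    and return the one-step-ahead variance forecast.
    alpha weights the latest shock; beta carries yesterday's
    variance forward (the persistence)."""
    v = v0
    for r in returns:
        v = omega + alpha * r ** 2 + beta * v
    return v

# Invented daily returns and textbook-style parameters, for illustration.
rets = [0.01, -0.02, 0.015, -0.01]
v = garch11_variance(rets, omega=1e-6, alpha=0.08, beta=0.90, v0=1e-4)
print(f"next-day volatility forecast: {v ** 0.5:.4f}")  # about 1.1% daily
```

    In practice the coefficients are fitted to years of price history, and the forecast feeds directly into an option-pricing model in place of a constant volatility.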

    Trying to place a bet on volatility alone is a little tricky, since changes in the underlying stock price can also affect an option's value. To remain independent of the overall lurch of stock movements, arbitrageurs go “long” (buy) the underlying stock or go “short” (borrow the stock, then sell it, promising to replace it later). A short position, for instance, rises in value if the stock goes down, since you can buy it cheaply and replace the shares you borrowed. That could offset a call option, whose value goes the other way. The balancing act is similar to the one Black and Scholes used to price an option.

    So traders place their bets by buying the options and hedge by adjusting how long or short they are in the underlying stock. When the option expires, the traders cash out. If the stock's volatility turns out as they predicted, rather than as the option's price implied, they win. It's really “statistical arbitrage,” one trader says. “It pays off 51% of the time.”
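    The balancing act can be sketched with the Black-Scholes machinery itself. In the toy example below (all numbers invented), a trader long one call shorts `delta` shares; a $1 move in either direction then leaves the hedged position almost unchanged, and the small positive residue is precisely the exposure to realized volatility the trader wanted to isolate:

```python
from math import exp, log, sqrt
from statistics import NormalDist

N = NormalDist().cdf

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (log(S / K) + (r + sigma ** 2 / 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

def call_delta(S, K, T, r, sigma):
    """Shares to short per call held, canceling first-order stock risk."""
    d1 = (log(S / K) + (r + sigma ** 2 / 2) * T) / (sigma * sqrt(T))
    return N(d1)

# Invented numbers: stock at $100, at-the-money 3-month call, 20% vol.
S, K, T, r, sigma = 100.0, 100.0, 0.25, 0.05, 0.20
delta = call_delta(S, K, T, r, sigma)

# Long one call, short `delta` shares: P&L for a $1 move either way.
for dS in (-1.0, +1.0):
    pnl = (bs_call(S + dS, K, T, r, sigma)
           - bs_call(S, K, T, r, sigma) - delta * dS)
    print(f"stock moves {dS:+.0f}: hedged P&L = {pnl:+.4f}")
```

    The residue is the option's gamma at work: a delta-hedged long-option position gains a little whenever the stock actually moves, which is why it amounts to a bet that realized volatility will beat the volatility baked into the option's price.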

    Bond interest rates and prices also show volatility, opening another opportunity for arbitrage. Here the problem takes on an extra dimension. Bonds have a particular expiration date (a 1-year bond pays interest for a year). Options on them have a separate expiration date. Traders take the universe of bond-price options and interest-rate options (called swaptions because the interest rate is often packed in something called a swap) in the market and, from their prices, make a contour plot showing how the market thinks volatility will change as a function of the two expiration dates. The surfaces ought to be smooth—especially for points in the distant and unknown future. But sometimes they aren't. This summer, arbitrageurs bought swaptions on a bet that a dimple in the U.S. interest-rate volatility landscape would flatten out.

    Arbitrageurs also scour the market for subtle internal contradictions. Traders noted that the planned European Monetary Union should eventually make the yield on, say, Italian bonds and German bonds converge. Last summer, volatility surfaces for German and Italian bonds were similar, implying that the market expected the two economies to merge, but the bond yields themselves remained far apart, as if the market was ignoring the merger. Only one outcome was possible, so the arbitrageurs took positions that would make money either way—whether the yields converged or the volatility surfaces diverged.

    Feasting on crumbs

    When arbitrage does pay off, the profit can be minuscule—in some cases, a hundred dollar investment may bring a penny payoff. Factor in the overhead costs of executing a trade, and many aren't worth doing at all, except in enormous volume. So while arbitrageurs page through academic journals looking for technical insights, they're also engaged in a humbler enterprise: borrowing money. At the time of the crash, LTCM had parlayed $4 billion in actual cash into some $100 billion in spending money.
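    The arithmetic behind that leverage is stark. Taking the article's figures at face value ($4 billion of cash supporting roughly $100 billion of positions):

```python
capital = 4e9          # LTCM's actual cash, per the article
positions = 100e9      # "spending money" after borrowing
leverage = positions / capital
print(f"leverage: {leverage:.0f}x")

# A one-basis-point (0.01%) convergence on the full position...
profit = positions * 0.0001
print(f"1 bp gain = ${profit / 1e6:.0f} million, "
      f"or {profit / capital:.2%} of capital")

# ...but a 4% adverse move wipes out the capital entirely.
loss = positions * 0.04
print(f"4% adverse move = ${loss / 1e9:.0f} billion loss "
      f"against ${capital / 1e9:.0f} billion of capital")
```

    At 25-to-1, tiny spreads become respectable returns on capital, but the same multiplier turns a modest adverse move into ruin, which is what the banks' collateral demands forced the funds to confront.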

    The problem with borrowing lots of money is that at some point, the lender wants it back. And when the Russian crisis broke, banks wanted more and more collateral on the loans they'd made. Investors the world over were ditching their favorite stock or bond for the safety of German bonds and U.S. treasury bills. The shift pushed some arbitrage spreads the wrong way. Pricing gaps that were supposed to close yawned wider, and bets on the spread, instead of gaining in value, started to tank.

    Worsening matters for the bond trades, the arbitrageurs had hedged sloppily. In the swaption gamble that the dimple in the U.S. interest-rate volatility would disappear, traders would have survived if they'd hedged by shorting swaps. But instead, they went with U.S. treasury bonds, which usually move in lock-step with swaps and were slightly cheaper to trade. As usual, the combination was supposed to isolate volatility and insulate the traders from overall market movements.

    But in the panic, people bought U.S. bonds in record numbers and their price rocketed (see graph). To raise money to meet the banks' demands for collateral, hedge funds had to back out of the trades. Replacing the bonds they had shorted cost them hundreds of millions of dollars. They tried to sell off the swaptions, but during the crisis nobody was buying, and they went cheap. Many trades on stock volatility met similar fates. “It was wild craziness for 3 weeks,” Weinberger recalls. “In my entire history, I've never seen anything like that.”

    Some competitors watched LTCM's fire sale with a certain glee. “It was hypnotic,” one recalls, “then sickening.” Sickening because it started to happen to everyone. “It wasn't supposed to be so hard to sell,” one trader says. “What we missed was that other hedge funds were doing the same thing. That wasn't an input to anybody's model.” Traders usually think of the market as something external. But during the crash, “[arbitrageurs] looked around,” says RISK's Dunbar, “and realized they were the market.”

    Failure, QED?

    To many traders, the crash simply reflected tactical mistakes—not having enough cash on hand, not putting themselves and imitators into the models. They point out that today, market relationships are already returning to normal. “If the hedge funds had had enough money to hold on, many of these bets would have paid off,” one trader says. “They were trying to make too big a profit,” agrees Doyne Farmer, a former theoretical physicist at Los Alamos National Laboratory who started Prediction Company, a quantitative finance group in Santa Fe, New Mexico. “LTCM [borrowed] to a degree that would give most of us indigestion.”
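Farmer's point about borrowing is simple arithmetic. In the sketch below (round numbers invented for illustration, though leverage of this order was widely reported for LTCM), financing 25 dollars of positions with one dollar of capital turns a small adverse move into a wipeout:

```python
# Toy leverage arithmetic -- all numbers invented for illustration.
equity = 1.0    # the fund's own capital
assets = 25.0   # positions financed mostly with borrowed money
leverage = assets / equity

# A small adverse move in the positions, scaled up by leverage:
asset_move = -0.04                       # positions lose 4% of their value
equity_change = asset_move * leverage    # loss as a fraction of equity

print(leverage, equity_change)  # high leverage magnifies a 4% dip into ruin
```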

    Some outsiders agree, saying the hedge funds simply overreached. “This was a bad use of models,” says Ron Dembo, a former Yale University mathematician who now runs Algorithmics Inc., a financial software company in Ontario, Canada. “The mathematics is beautiful, but it's based on very heroic assumptions.” Among them, he says, is that everything won't go wrong at once, as it did last summer. Dembo says hedge funds should have crash-tested their portfolios to make sure they could withstand such a global stampede for quality bonds.

    But others say the events of last fall show that arbitrage is a zero-sum game. “It's like a racetrack,” says Eugene Fama, an economist at the University of Chicago who downplays the math involved. “These are plain old bets.” Fama is famous for enthroning the notion of “efficient markets” (see sidebar). The basic idea is that market prices adjust intelligently and at lightning speeds. There is little room for true arbitrage. If you plotted the profit quantitative hedge funds make over many years, he says, “I bet it would be symmetric around zero.”

    Others think the arbitrage game is a winning strategy, but that the gains come at a cost. “My view is that it looks like they're selling insurance policies,” says Duke economist David Hsieh. Sometimes, for instance, people may sell risky bonds or options too cheaply because they would prefer to sleep well at night. Arbitrage traders “insure” these people by taking the securities off their hands. Maybe the hedge funds were unlucky, Hsieh says, and catastrophe struck early. MIT's Andrew Lo adds that by buying unpopular bonds or options in large quantity, the arbitrageurs provide “liquidity,” helping things move along and extracting a facilitator's surcharge.

    But the insurance idea doesn't sit well with some arbitrage traders, because it implies that, from time to time, they are sure to lose big. “My nightmare is that we're just selling insurance,” one says. He suspects that pricing anomalies are real, although it takes a lot of smarts and math to find them. “The set of market participants not paying attention is far deeper and vaster than the people who are doing arbitrage.” Most people push and pull on one part of the market, he says, and in doing so throw it out of whack with respect to another part. In this view, arbitrageurs are like the deckhands on a schooner who scurry about tightening all the knots—and taking a healthy profit.

    Whatever role arbitrage plays in the scheme of things, many suspect it will return from exile. Already LTCM appears to be back on track. Following a private-sector bailout last fall, LTCM made investors a healthy 11% by year's end. “We're not confirming that number, but it's accurate,” an LTCM spokesperson says. A spokesperson for D. E. Shaw & Co. would not give numbers but says that “the firm is extremely robust and growing.”

    David Shaw's 4.8 cent nickels got expensive last summer, but, as another trader put it, “I think one day there will be 4.8 cent nickels again, even 4.2 cent nickels. There are opportunities out there right now.”

  Can the Market Be Outwitted?

    1. David Kestenbaum

    Arbitrage traders say the marketplace is filled with tiny anomalies that can be exploited for hefty profits (see main text). It's easy to see why such financial goofs might exist. On an average day, the Chicago Board of Trade swims with unshaven traders in sneakers and colored jackets, shouting, writing numbers on their hands, sliding on a floor covered in printouts and cough drop wrappers, leaping down stairs into the trading pits and exchanging fistfuls of money with silent hand signals. The bar upstairs is filled by 3 p.m. As David Weinberger, a well-known arbitrageur, puts it, “Think of the market as a black box. There are all these inputs: hope, fear, greed, hemorrhoids, hangovers, rational thought, irrational thought. All that goes in one end and out comes a price.”

    Surprisingly, the human black box generally seems to spit out reasonable stock prices. New information—about corporate earnings, unemployment, or hurricanes threatening crops—is quickly incorporated into a stock's price. Most people who “beat the market” by picking underpriced stocks seem to be just lucky. The Wall Street Journal regularly pits top stock pickers against darts thrown at the stock pages. Stock pickers beat the darts about 60% of the time, but do no better than the Dow Jones Industrial Average.

    But are the markets really so efficient as to preclude arbitrage, with its mathematical tools for teasing out the anomalies other people overlook? “Put it this way,” says University of Chicago economist Eugene Fama, “I wouldn't try it.” But plenty do, and claim to be making money. And a growing number of economists have started to think about the market as something other than a well-lubricated machine. “I would say that the finance profession is becoming much more open to the idea that markets are not efficient,” says Robert Shiller, an economist at Yale University.

    As evidence that “irrational” factors can influence the market, Shiller points to days when the market has lurched dramatically even though the world was quiet. “People try to concoct what the news was. But to me it's clear that the news was the market itself,” running on its own internal dynamics. The burgeoning field of behavioral economics has turned up market psychologies that can drive prices out of kilter. Some studies, for example, indicate that people tend to hold onto losing stocks, hoping they will rise, and eagerly sell off winning stocks, regardless of what the news is.

    And there is some evidence that math can identify the resulting anomalies and thus predict when they will correct themselves. Andrew Lo of the Massachusetts Institute of Technology and Craig MacKinlay of the University of Pennsylvania found, for instance, that two stocks from related companies often do a little dance in which a movement in one may presage a movement in the other. So if one drops, the other may be temporarily overpriced.
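The lead-lag “dance” Lo and MacKinlay describe can be mimicked with synthetic data. In the sketch below, the series, the lag, and the 0.6 coefficient are all invented: stock B echoes stock A's previous-day return, so the correlation between the two return series is near zero on the same day but jumps at a lag of one day — the kind of statistical footprint an arbitrage model hunts for.

```python
import random

random.seed(42)

# Synthetic daily returns: stock B echoes stock A's move one day later,
# plus its own noise.  (Illustrative only -- not real market data.)
n = 500
a = [random.gauss(0, 1) for _ in range(n)]
b = [0.0] + [0.6 * a[t - 1] + random.gauss(0, 1) for t in range(1, n)]

def corr(x, y):
    """Pearson correlation of two equal-length series."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    vx = sum((xi - mx) ** 2 for xi in x) ** 0.5
    vy = sum((yi - my) ** 2 for yi in y) ** 0.5
    return cov / (vx * vy)

same_day = corr(a, b)           # near zero: no contemporaneous link
lag_one = corr(a[:-1], b[1:])   # sizable: A today predicts B tomorrow

print(round(same_day, 2), round(lag_one, 2))
```

As Lo's own follow-up work suggests, a pattern this visible tends to be traded away once enough arbitrageurs find it.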

    “Stock prices are predictable to some degree,” Lo says, and with effort you can make money forecasting them. In fact, when Lo and MacKinlay started to look at more recent market data, from after 1988, they found a surprising change. “Some of the patterns we had observed have now been very thoroughly mined away. Most of the effects are gone.” The reason? Lo partially credits arbitrage traders at D. E. Shaw & Co. in New York. “David Shaw told me they were trading based in part on our studies,” Lo says. “I was very flattered.” But, he adds, “none the richer.”


    New Clues to How Proteins Link Up to Run the Cell

    1. Marcia Barinaga

    Recent work highlights the role of phosphate-bearing amino acids in bringing proteins together to control cellular activities

    Proteins are at the heart of the biochemical machinery that makes a cell run. But unlike the parts of, say, a car engine, which are permanently bolted together, the cell's molecular cogs and wheels are constantly assembling and disassembling. Before each task, they must locate and latch onto the right partners in the congested workspace of the cell. Recently, researchers have been learning how protein elements called binding domains help control this regulated coupling and uncoupling.

    The latest advance comes from cell biologist Kun Ping Lu and his team at Harvard Medical School, in the form of a new function for the so-called WW domain, a conserved amino acid sequence found in more than 100 proteins with diverse functions. Their report on page 1325 shows for the first time that the domain binds to other proteins only when certain of the serine amino acids in those targets carry a phosphate group. This suggests that in these cases, the domain controls a particularly important class of protein interactions: those that are turned on and off by signals within the cell.

    Cells regulate activities ranging from division to self-destruction by tagging proteins with phosphate groups. And by adding the WW domain to the small group of protein-binding domains that home in on phosphoserine, the new finding suggests that at least some WW domain-containing proteins play a key role in controlling those cellular processes. Lu compares the WW domain to the SH2 domain, which enables proteins containing it to link up with proteins that contain phosphorylated tyrosine amino acids and is extremely important in controlling cell growth, among other things.

    Other researchers note, however, that the analogy may not be complete. “I'm not 100% convinced that all WW domains are going to bind phosphoserine in the same way that all SH2s bind phosphotyrosine,” says cell biologist Ray Deshaies of the California Institute of Technology (Caltech) in Pasadena. “But I am persuaded that at least a fraction of them do.”

    And that is enough to make the result intriguing. “It is a very fascinating paper,” says protein-signaling researcher Tony Pawson of Mount Sinai Hospital in Toronto, Ontario, and not just because it reveals a new function for the WW domain. It is also, Pawson adds, “part of an important, emerging story … that serine phosphorylation produces effects through controlling protein interactions.”

    Until about 10 years ago, cell biologists thought that phosphate groups, which are added to proteins by enzymes called kinases, exert their effects mainly by altering proteins' shapes in ways that influence their catalytic activity. But for proteins phosphorylated on the amino acid tyrosine, which include growth-factor receptors and other crucial components of cell signaling, a new view emerged in 1990. Pawson's team and that of Hidesaburo Hanafusa at Rockefeller University discovered that when phosphate is added to certain tyrosines, proteins with SH2 domains swoop in and bind to the phosphotyrosine-containing segment.

    That “changed the way we thought about tyrosine phosphorylation,” says Andrey Shaw of Washington University in St. Louis. It shifted the focus from its possible effects on catalysis to another role: enabling the phosphorylated protein to interact with proteins it would not have been attracted to before. But that thinking didn't carry over to proteins phosphorylated on serine, Shaw says: “We persisted in thinking about serine phosphorylation the same old way.”

    In 1993, researchers got the first glimpse of a parallel role for phosphoserine when Marc Montminy of the Salk Institute in La Jolla, California, Richard Goodman of the Vollum Institute in Portland, Oregon, and their colleagues showed that phosphorylating the gene-regulating protein CREB on serine triggers its activation by enabling it to bind to another protein, CBP.

    More evidence followed in 1996. It came from work on the so-called 14-3-3 proteins, which had started turning up bound to a variety of important regulatory proteins, such as the tyrosine kinase Raf, the cell death-promoting protein BAD, and Cdc25, a phosphate-removing enzyme and important controller of cell division. After Deborah Morrison at the National Cancer Institute's Frederick Cancer Research and Development Center reported that 14-3-3 proteins require phosphoserine in the targets, Shaw and his Washington University colleague Tony Muslin showed that phosphorylation of a particular serine-containing sequence in a variety of proteins sparks their binding to 14-3-3. “We thought that was a paradigm shift for serine phosphorylation,” says Shaw. The effect of the binding is still not clear, but in January, Paul Russell's team at the Scripps Research Institute in La Jolla, California, reported that it may keep the bound proteins from entering the cell's nucleus.

    In 1997, researchers found another role for phosphoserine-triggered protein association, in the regulated destruction of key proteins that are no longer needed. There were already hints that phosphorylation on serine may, in some cases, cause proteins to be destroyed. And work in 1996 by Steve Elledge of Baylor College of Medicine in Houston, Texas, and Kay Hofmann at the Swiss Institute for Experimental Cancer Research in Lausanne suggested that proteins called F-box proteins aid in this destruction.

    In 1997, two separate teams—Elledge, Wade Harper of Baylor, Michael Tyers of the Mount Sinai Hospital in Toronto, and their co-workers and a team led by Caltech's Deshaies—put the picture together. They showed that some F-box proteins contain protein-binding domains called WD40 domains or leucine-rich repeats that let them hitch up to proteins that are phosphorylated on serine or sometimes on another amino acid, threonine. The F-box proteins then tow their captives into an enzyme complex that tags them with a small protein called ubiquitin, which in turn shepherds them into the cell's protein-shredding machinery.

    Since that discovery, Elledge says, phosphoserine- or phosphothreonine-binding F-box proteins have turned out to be “a general piece of machinery” for the regulated protein destruction essential for cell activities, including inflammation, viral replication, and embryonic development.

    In the current work, the Lu team has added proteins containing the WW domain to the phosphoserine-binding club. Several teams had already shown that the domain binds to a proline-rich sequence in proteins, but some protein targets of the domain lack that sequence. Lu's team set out to search for another feature to which the WW domain might bind. One WW-containing protein, an enzyme called Pin1 that affects the timing of cell division, hooks up with many of the proteins that are phosphorylated at the onset of mitosis. The team speculated that the WW domain of Pin1, and perhaps other WW domains as well, might be recognizing phosphorylated sites on their targets.

    Their hunch proved correct. As a target, they used Cdc25, which helps trigger a cell's entry into mitosis and is phosphorylated on serine just before mitosis begins. They found that the WW domains of Pin1 and an enzyme called Nedd4, which plays a role in protein degradation, bind to Cdc25 only when it has its premitosis pattern of phosphoserines. It is not clear yet how many of the more than 100 proteins carrying WW domains bind to phosphoserine, however, since Lu's team has shown the binding for only two WW domains and many of the others bind to proline-rich sequences lacking phosphoserine.

    In spite of these unknowns, researchers are already getting inklings of how phosphoserine-binding proteins might interact in an intricate web to control such important activities as cell division. Proteins such as Cdc25 must be activated and inactivated in rapid succession, because their action is needed only in a narrow window of time in the cell cycle, notes Deshaies. And that timing now appears to be regulated by proteins that bind to Cdc25's phosphorylated serines.

    Russell's group at Scripps showed in January that Cdc25 can be kept in abeyance to delay mitosis—which is necessary, for example, when the cell's DNA has been damaged and must be repaired before the cell divides—if it is phosphorylated on a particular serine to which a 14-3-3 protein can bind. The 14-3-3 protein holds Cdc25 in the cytoplasm, preventing it from acting in the nucleus to begin mitosis. When the time arrives for mitosis to begin, Cdc25 loses that phosphate, allowing it to slip from 14-3-3's grip and enter the nucleus. It also gets a new set of phosphates that bind it to Pin1 as part of the steps in activating mitosis and also appear to mark it for later destruction by Nedd4. Phosphorylation in this case may work “like flipping an egg timer,” says Deshaies. “Once you turn the thing on, you only want to have it on for a certain period of time.”

    The fast pace of findings about how cells get proteins to make these brief alliances underscores that “this is the way that cells are organized,” Pawson observes. “Ten or 20 years ago we were used to thinking about enzymes, and everything was cascades of enzymes. But the more we look, the more we see that there is a fundamental organization that is more about modular interactions and protein localization. Enzymes are being controlled by their proximity to their substrates.” And WW domains and their phosphoserine targets appear to be crucial, although temporary, bolts in the cell's ever-changing biochemical engine.


    Behind the Headlines of Endostatin's Ups and Downs

    1. Jon Cohen

    A year-long effort to replicate remarkable tumor-shrinking experiments in the glare of publicity has produced confusing results

    Melinda Hollingshead was “furious” when she read a story in the 11 February Boston Globe that said scientists from her National Cancer Institute (NCI) lab had been able to “reproduce” the world's most celebrated—and questioned—cancer experiment in mice. This feat, the Globe reported, paved the way for a promising new drug called endostatin to move into human trials. Three months before, Hollingshead was just as annoyed—by stories saying that she and other researchers had failed to reproduce the original results.

    The remarkable results Hollingshead has been trying for more than a year to duplicate come from Judah Folkman and co-workers at Boston's Children's Hospital. In mouse experiments, Folkman's group had found that endostatin hampered the growth of blood vessels—a process called angiogenesis—that feed tumors, making cancer disappear. The results touched off a media frenzy last spring when they were featured on the front page of The New York Times, only to be deflated in the fall when many media reported that researchers, including Hollingshead, couldn't repeat them.

    Now, the Globe had reported—correctly—that scientists from Hollingshead's lab, working side-by-side with researchers in Folkman's lab, had arrived at the astounding results that Folkman first published in the 27 November 1997 Nature. But to Hollingshead, both the positive and negative newspaper coverage has missed the complexity of the endostatin story. The earlier failures didn't mean that the strategy was hopeless, and she qualifies her latest success: Repeating an experiment, she says, means doing it independently. “I don't feel we really verified or repeated anything,” says Hollingshead. For cancer patients, she says, the coverage has been a brutal roller-coaster. “People are clinging to any little thread of hope they can catch hold to, and their fingers bleed from trying to climb.”

    Behind the headlines of angiogenesis inhibitors being a “miracle cure” one day and a “failure” the next lies a scientific saga that emphasizes how small differences in techniques, reagents, and assays can foil attempts by one lab to repeat the work of another. It shows that replication, a cornerstone of the scientific process, means different things to different people. And it also helps clarify why this media-driven frenzy about endostatin, fueled by a potent mix of medical and commercial promise, has been so confusing and frustrating to the public and scientists alike.

    To Folkman, the father of angiogenesis, the media spotlight has been more than frustrating. With all the attention, “it is not possible” to conduct science, says Folkman, noting that he rarely gives scientific presentations any more and has turned down 2300 interview requests since last May's front-page New York Times article. NCI similarly has been besieged. “What is unusual is not the drug—it's our attempt to respond to the unbelievable interest in this drug,” says NCI director Richard Klausner.

    The making of a hero

    The “unbelievable interest” began with publication of the Nature paper. An accompanying “News and Views” by University of Toronto cancer researcher Robert Kerbel called the work “unprecedented” and “startling.” Kerbel cautioned that success with a cancer drug in mice often doesn't translate to humans, but said angiogenesis inhibitors “could herald a new era of cancer treatment.”

    Many newspapers, including The New York Times, wrote about the exciting findings in similarly hopeful but tempered tones. As the Nature paper detailed, the researchers first injected cancer cells into the flanks of healthy mice, which developed what's called a Lewis lung carcinoma, an especially difficult tumor to treat with chemotherapeutic drugs. After an injection with endostatin, Folkman's lab found that the tumors shrank, and the researchers stopped the treatment. When the tumors returned, they again injected the mice with endostatin, and the tumors regressed. The drug continued to work after six cycles, indicating that no resistance had developed. More remarkable still, after disappearing for the sixth time, the tumors had not returned more than 5 months later, at which point the researchers ended the experiment. These results were much more striking than had been obtained with other compounds that inhibit angiogenesis.

    Even Michael O'Reilly, the Children's Hospital oncologist who discovered endostatin and was last author of the Nature paper, had trouble believing his own results. “The first thing I thought when the tumors didn't come back is that it was just that one experiment, it was a fluke,” says O'Reilly. So before submitting his paper to Nature, he repeated the entire experiment and also tested the drug against two other types of mouse tumors. The data held up.

    NCI—which 3 months earlier had begun a project to develop endostatin with EntreMed, a biotech company in Rockville, Maryland, that had licensed the compound from Boston's Children's Hospital—wanted to confirm the results before moving the drug into humans. “It's very unusual for us to sponsor a clinical trial where we haven't seen activity ourselves or the overwhelming preponderance of evidence hasn't shown that it's reproducible,” explains Edward Sausville, associate director of NCI's developmental therapeutic branch and Hollingshead's boss. Folkman, too, liked the idea of replicating the data. “You must have evidence in mice before you move into humans,” says Folkman. “That's basically the philosophy of the NCI, and we agreed with that.”

    Then on 3 May, the Times ran its glowing article, which quoted Nobelist James Watson comparing Folkman to Charles Darwin and saying “he is going to cure cancer in 2 years.” EntreMed's stock skyrocketed. Two hundred journalists a day requested interviews with Folkman. And enormous pressure began to build at NCI to move endostatin into the clinic.

    NCI's Hollingshead was “appalled” by the article, in part because she already had run into difficulties trying to repeat the experiment. Hollingshead had sent five scientists to Boston for 2 days in February 1998 to learn techniques from Folkman's lab, where they were welcomed with open arms, she says. “Injecting mice and taking care of mice sounds very simple, but it has many, many little pitfalls,” says Folkman. But despite the training, when Hollingshead injected the mice with endostatin in her lab, it didn't work.

    Then again, this batch of endostatin turned out not to work in Folkman's lab, either. The problem, the Folkman team decided, was that O'Reilly's first experiments used endostatin he had made in small amounts by stitching the gene for the protein into the bacterium Escherichia coli. NCI, in contrast, hired a company to make large amounts of E. coli-derived endostatin, which they hoped to share with the research community at large. But something in the scale-up, apparently, had ruined endostatin's “activity.” Later batches of endostatin, which NCI made at its own plant in Frederick, Maryland, using both an E. coli expression system and another one that relied on mammalian cells, fared no better when Hollingshead tested them.

    A major stumbling block in trying to produce active endostatin was that the researchers had no test tube assay to assess the activity of a given batch of material. With many other angiogenesis inhibitors, researchers can assess activity by putting a solution of the compound on endothelial cells—which line blood vessels—and determine whether the cells stop growing or migrating around a dish. But the E. coli-derived endostatin was insoluble. “It's kind of like pouring sand or slime onto your cell surface,” explains Hollingshead.

    So O'Reilly made a small batch of material in his lab, tested it in mice for the ability to shrink tumors—the only assay he had—and sent it to Hollingshead by Federal Express, packed on dry ice. It, too, failed. O'Reilly wondered whether the shipping process was somehow responsible, so he suggested to Folkman that they mail some of their product to themselves. Don't waste the money, Folkman said—put some on dry ice in your car and drive around with it. “Sure enough … it didn't work, either,” O'Reilly says. “The protein is great if you don't try to ship it.”

    Bad news

    By the fall of 1998, several labs had tried to engineer endostatin or received it from Folkman, but no one could make it work. (Another promising angiogenesis inhibitor, angiostatin, was apparently proving equally fickle for Bristol-Myers, which announced earlier this month that it was dropping work with the compound because it was having trouble producing it reliably.) On 12 November, The Wall Street Journal broadcast this problem in a front-page article that catalogued several of Folkman's previous findings that others supposedly had trouble replicating.

    Folkman, who has a stellar reputation for his ethics and scientific rigor, saw the article as “destructive” to him and people with cancer. “It's hard for the public and media to understand that when something doesn't work, it's not scientific manipulation, it's the way science is,” he says. “All of our papers for 30 years have been reproduced, but they all took time, and it usually was 1 to 2 years.” Ironically, Folkman notes, on the day the Journal article came out, Vikas Sukhatme of Beth Israel Deaconess Medical Center gave a talk at Harvard describing how he had suppressed the growth of tumors in mice with endostatin.

    Sukhatme took a different tack from O'Reilly, however. He manufactured his mouse endostatin in yeast, which yielded a soluble protein that he could evaluate in the various test tube assays before giving it to mice. He then tested the compound in “nude” mice that had a renal cell carcinoma, a different tumor and mouse system from the one Folkman's lab used. So this work extended, but did not replicate, the findings in the Nature paper.

    Bjorn Olsen of Harvard Medical School, who consults for EntreMed, now has positive results from yet another system: soluble human endostatin made in human kidney cells. Further confusing the picture, EntreMed hopes to conduct human trials with yet another variation: soluble human endostatin made in yeast. Olsen cannot compare his in vitro data to EntreMed's because they use different migration assays. And although NCI's Sausville says EntreMed's human endostatin “does not reach the same [activity] level” as mouse endostatin when tested in mice, Olsen cautions that nobody has yet compared mouse and human endostatin both made in yeast.

    Kerbel of the University of Toronto points out that all these variables make news reports about endostatin all the more frustrating, because very few experiments are directly comparable. “None of these factors are being discussed,” says Kerbel.

    Repeat performance

    While others explored these new tangents, a technician from Hollingshead's lab stood next to scientists in Folkman's lab from 18 to 26 January and conducted parallel experiments that aimed to reproduce the original results published in Nature. The endostatin worked, and NCI announced the accomplishment in a press release that simultaneously revealed plans to launch human studies.

    “If you push us to the wall, have we replicated the experiment from soup to nuts? We haven't,” says Sausville. “Have we put ourselves in the shoes of people who've done it? Yes. We agree there is a phenomenon to observe.” Hollingshead, who is now planning to repeat the experiment with Folkman's workers in her lab, agrees. “We saw effects they observed with their mice, their tumor, their equipment, with the one exception being that our personnel were doing the injections of endostatin,” she says. “All that really states is our people know how to inject endostatin.”

    That, in itself, may be a critical skill. Folkman says it took him years to perfect his techniques, which rely on factors such as the amount of material in the syringe, the gauge of the needle, where you inject the mice, and the temperature of the room that houses the animals. Douglas Hanahan, a cancer researcher at the University of California, San Francisco, confirms that he had much trouble with endostatin until he visited Folkman's lab and learned these subtleties. “Since then, we've had very reliable results,” Hanahan says.

    Hanahan, who co-chairs an NCI advisory subgroup on angiogenesis, says trying to repeat the experiment is important because it may help researchers figure out why endostatin is so fickle. “It would be a big shame if we moved into the clinic prematurely and the results were negative,” he says.

    NCI director Klausner says, however, that the accumulation of new data from EntreMed scientists, Olsen, and Sukhatme was enough to justify moving into clinical trials. “The decision to support going forward was made at a meeting about a month ago, and it was before I had seen the results from Boston,” says Klausner.

    NCI and EntreMed currently plan to begin small human trials of endostatin by the end of the year. Folkman and O'Reilly say they're excited to see how endostatin will work in the clinic. “Regardless of whether the media likes this stuff or doesn't, the real proof is going to be if this works or not in patients,” says O'Reilly. You can be sure those results will get a blast of publicity—and spin.


    China's Science Reforms: The View From the Top

    1. Dennis Normile,
    2. Xiong Lei*
    1. Xiong Lei writes for China Features in Beijing.

    Zhu Lilan, a polymer chemist, discusses China's efforts to improve science and harness it to sustained economic growth

    BEIJING—Zhu Lilan has already earned at least a footnote in history for being China's very first minister of science and technology. But the former polymer chemist isn't content with that achievement. “Only through making a real contribution can you justify your position,” she says. In fact, hundreds of thousands of Chinese scientists are being asked to do exactly that—justify their positions—as part of a massive, long-term government reform aimed at harnessing new technologies for economic growth. “Before, the state allocated financial resources according to how many people you had under you,” she says. “Now it will be according to what you do and how well you do it.”

    Zhu, 63, is a product of the old research system, modeled after that of the Soviet Union, where many senior Chinese scientists were trained. After studying polymer chemistry at I. I. Mechnikov Odessa State University in Ukraine, Zhu joined the Chinese Academy of Sciences' (CAS's) Institute of Chemistry in Beijing in 1961 and rose through the ranks to become director in 1985. The next year she moved to the State Science and Technology Commission (SSTC) as vice minister, the first woman to hold that post. There she oversaw segments of a new program, called the 863 project, to develop world-class high technology in 15 areas.

    Wang Daheng, a prominent optical scientist who co-authored a proposal to the government that led to the 863 project, says Zhu “played an active role in making the project a success. She set up a system of running the project through expert committees, which proved very effective.” Last year the science commission was turned into the Ministry of Science and Technology (MOST), and Zhu succeeded longtime SSTC Minister Song Jian at the helm (Science, 27 March 1998, p. 2034).

    Her promotion comes at a crucial time for Chinese science. The government has made research one of the pillars of economic progress and pledged to increase R&D spending substantially. CAS, which operates the country's leading research laboratories, is being given $650 million over 3 years—an amount equal to the academy's current operating budget—for a “Knowledge Innovation Program” (Science, 8 January, p. 150). The goal is hardly controversial: increased support for fewer, top-level researchers and greater reliance on peer review to raise the quality of the basic science being supported. But the process—cutting overall staffing of 68,000 by more than half and introducing wide-ranging management reforms—is expected to be a painful one.

    China's system of support for more applied technology is also undergoing a painful shake-up. MOST itself has just launched the 5-year, $300 million 973 Program, so named because it was approved in March 1997 (Science, 18 December 1998, p. 2171). Its generous grants to peer-reviewed proposals in six broad areas deemed economically or socially important—life and information sciences, agriculture, natural resources and the environment, energy, and new materials—come at the same time hundreds of technology labs are losing their guaranteed state funding.

    Although most scientists agree with the need to make Chinese R&D more efficient, some criticize the harsh medicine being administered. And others, including Wang, argue that the government should pay more attention to basic science and understand that its findings don't necessarily translate into immediate economic payoffs. There is also a push to broaden the decision-making process and bring in more working scientists to set R&D priorities.

    Zhu has earned a reputation as a no-nonsense administrator who isn't afraid of tackling such issues. “I'm known as a harsh old lady,” Zhu said at her elevation to the SSTC post in 1986, “although I'm not mean.” However, subordinates say she's quick to cut people short if she feels that their presentations are long-winded. And they say that her trademark response to policy debates—“I'll be responsible for it”—serves as a not-so-subtle reminder that she enjoys exercising her power and that she expects colleagues to meet her high standards.

    Zhu discussed these and other matters shaping Chinese science in a 2-hour interview on 29 January in her office in Beijing. An edited transcript follows.

    Q: CAS has undertaken a major reform that involves cuts in the number of institutes and staff. How did that reform originate?

    A. Reform has been in progress since 1985, when the government decided that scientific research and development must be oriented toward the economic development of the country and that economic development must rely on the progress of science and technology. At that time, the science and technology establishment was not rational. Too many scientists were involved in pure research and not enough were engaged in solving problems of national economic development. As a result, [it was decided that] there must be a movement of people within the science and technology sector.

    The objective is not to cut the number of people. Rather, we want scientists and researchers to find where they can make their best contribution. In basic research, we wish to see fewer but the very best scientists, with more people moving toward the marketplace and serving economic development. The goal is to enable [business] enterprises to become the main source of technological innovation, while the research institutions and universities become the main source of knowledge innovation. Some intermediary organizations or agencies will emerge to facilitate the commercialization of research achievements.

    On a practical level, it is up to [each institute] to reform itself. We give them a goal and allow individual institutes to reach the goal in their own ways. [The institutes] approach us, not for permission but to get support and help. For example, the academy has been granted a large amount of money [for the Knowledge Innovation Program]. MOST, as a government agency, has set the final goal and will monitor the use of that money. We will say, “The state has given you scores of billions of yuan [8.3 yuan equal $1]. How many scientific achievements have you made? How many technological innovations have you made? If you have filed no patents and published no papers in Science, you will be held responsible!”

    Q: Will similar reforms be implemented at universities and other national institutes?

    A. Yes. This stage of reform is to be completed within 3 years. For example, 10 former ministries have been made bureaus or departments within the State Economic and Trade Commission. The reform of 242 mostly applied research institutes under these former ministries must be completed by June. The aim is not to reduce the number of research institutes or the number of scientists and researchers, but to run these research institutes like enterprises, to industrialize science and technology. These institutions are still entitled to apply for financial assistance for their projects and to participate in our national programs, including the 973 Program. But [winners will be chosen] through competition.

    Q: Traditionally it has been difficult for researchers to move from one institution to another. Will the reforms allow for greater mobility?

    A. We wholly encourage them to do so. But, really, not many are willing to move.

    Q: How were the priority areas selected for the 973 Program?

    A. There were three steps. First, we in MOST, the National Natural Science Foundation (NNSF), and the whole science community reached consensus on the guidelines or principles, the criteria for selecting projects, and the selection procedure. Then we established an expert committee headed by [former CAS president] Zhou Guangzhao and including other prominent scientists from various disciplines. The priority areas were first reviewed at a lower level within MOST, then by the expert committee, before MOST made the final decision.

    Q: Last year China adopted new rules on the export of genetic materials. There is also concern from abroad about censorship of the Internet and control over e-mail. Do you think these policies will hamper attempts to improve scientific links between China and the rest of the world?

    A. A lot of scientific advances, like the Internet, are like double-edged swords. They have both positive and negative aspects. If you fail to handle the negative aspect well, the positive impact will be limited. Our general principle is to promote the healthy development [of such tools]. The challenge is how we make sure that our regulations benefit such development. Many questions concerning the Internet are being debated and discussed throughout the world and also in China.

    There are a lot of high school and even primary school pupils interested in using the Internet. On the whole, this is a good thing. But there is also some inappropriate material on the Net, erotic pictures, for example. So we cannot let them have free access without checks. As for genetic materials, on the whole we intend to facilitate collaboration, but we have to ensure mutual benefit under established rules. These rules should enhance the collaboration, not stop it.

    Q: Who sets policy for CAS and NNSF?

    A. Policy is a form of guidance rather than an order. We solicit suggestions from all those concerned and we reach a consensus. On that basis, everyone in the field has to make efforts to achieve the goals established by the policy.

    Take fundamental research. Scientists would like greater freedom to pursue their own interests. But we are a developing country with limited financial resources. We respect the individual interests of scientists, but we try to persuade them to match their individual interests with national needs and challenges. Some scientists do not totally agree with that, but we have to find ways to combine the two.

    Q: Why did the State Science and Technology Commission become the Ministry of Science and Technology?

    A. There's no essential difference between the ministry and the commission. To us the most important point is the [national] strategy of reinvigorating China through science and education. This strategy makes me both happy and anxious: happy because the leaders of our society are attaching more importance to science and education, and anxious because we must make sure that we can help advance social progress and economic development.

    Q: What is the relationship between the ministry, CAS, and the NNSF? And what is the role of the State Council and the new Leading Group on Science and Education?

    A. The roles of the three organizations you mentioned are fundamentally different. Our ministry is responsible for formulating national policies on science and technology and overseeing national programs or initiatives. We have no research institutes whatsoever. CAS is a research entity [with] about 100 research institutes that carry out work in accordance with national policy. The NNSF was set up [in 1985] to develop China's basic research, and the researchers it supports have more freedom [than in projects supported by individual ministries].

    The Leading Group on Science and Education [of which Zhu and Lu Yongxiang, president of CAS, are members] is part of the State Council and is headed by Premier Zhu Rongji himself. It coordinates all efforts related to science and technology carried out by the different ministries and commissions.

    Q: What challenges face developing countries seeking to move into the front ranks of science?

    A. We have entered a new era in which scientific and technological development is vast and extremely quick, and its impact on economic development is also powerful. This presents a new opportunity for latecomers. To give one practical example, in developed countries, thousands of miles of copper cables have been laid for communication lines. But the latecomers, the newly developing countries, can go in one step to fiber-optic cables.

    Q: What do you hope to accomplish as the first minister of MOST?

    A. We aim to increase the contribution of science and technology to the national economy by another 10 percentage points. We estimate that advances in science and technology currently account for 40% of the annual growth in agricultural production and for 30% of industrial growth. The idea is to rely on scientific progress while proceeding from market demands, in accord with a Chinese saying that roughly translates as reaching for the sky while keeping your feet on the ground.
