News this Week

Science  01 Feb 2002:
Vol. 295, Issue 5556, pp. 469


    Sensor Failure in 1997 Test Sparks New Controversy

    1. Andrew Lawler

    BOSTON— A government investigation has found that much of the data from a 1997 ballistic missile test flight that the U.S. Department of Defense labeled a solid success may be useless. Sources familiar with two studies by the General Accounting Office (GAO) say they highlight software and sensor problems on the 1997 flight over the Pacific. The wording of the reports is still the subject of fierce internal debate, but their upcoming release is sure to provide more ammunition for skeptics of the controversial ballistic missile defense program.

    In the 24 June 1997 test, a vehicle carrying an infrared sensor was launched from Meck Island to determine whether it could discriminate among nine objects launched nearly simultaneously from the California coast. That capability is a first step on the path to identifying and shooting down enemy warheads while ignoring decoys. Shortly after, program manager Joseph Cosumano declared that “all aspects” of the $100 million test “were highly successful.”

    But that characterization has been hotly disputed for years. A former employee is suing contractor TRW Inc., claiming it falsified data and disregarded science and engineering standards, and a physicist at the Massachusetts Institute of Technology (MIT) argues that the test's flaws were covered up. The government is now spending $8.3 billion a year on the program, with a boost expected in next week's presidential 2003 budget proposal.

    The debate prompted two legislative requests—from Representative Howard Berman (D-CA) and Senator Charles Grassley (R-IA) and from Representative Ed Markey (D-MA)—for studies by GAO, Congress's investigative arm. The reports uncovered a problem with the sensors that was never mentioned in unclassified materials, government officials say. GAO's Robert Levin confirmed that the sensor's performance lies at the heart of the reports, but he declined further comment. “There is a big fight over how to spin it,” says Michael Levi, an analyst at the Federation of American Scientists in Washington, D.C. Defense officials last week admitted that the 1997 test was not as “robust” as first thought, adding that the vehicle and sensor in question are no longer part of the missile defense program.


    U.S. defense officials say they have a new technical strategy for spotting enemy missiles.


    A key component of the 1997 test was the focal plane array used to detect infrared wavelengths from the targets. The resulting data were key to determining whether an object was a warhead or a decoy. The sensor was calibrated to function best at 10 kelvin, but during the test the sensor temperature dropped only to an estimated 13 K. That discrepancy would have affected its accuracy. Realizing the problem in midflight, controllers tried to recalibrate the instrument by pointing it at Arcturus, a star with a known strong signal, but they were not sure of the sensor's exact temperature. As a result, “there was a tremendous amount of noise” in the resulting data, says one government engineer familiar with the test results. “It was guesswork to filter it out; there's a lot of uncertainty in that data.”

    Theodore Postol, an MIT physicist and longtime missile defense critic who has studied the data, estimates that the ratio of noise to signal was 25 to 40 times higher than expected. In a 14 January letter to MIT president Charles Vest, Postol says the high ratio “renders essentially all of the data in the experiment useless or open to question.”

    The sensor had about 80 seconds to gather data on the objects. The Department of Defense initially claimed it had received 55 seconds of data, later revising that figure to 18 seconds after an analysis by an independent panel. But even those 18 seconds of data are questionable, according to Postol and the government engineer. The engineer says that an informal but classified analysis in 2000 by researchers at MIT's Lincoln Laboratory in Cambridge, Massachusetts, uncovered the problem. Lab spokesperson Roger Sudbury declined to discuss the matter.

    Keith Englander, deputy for system engineering and integration at the Pentagon's newly redesignated Missile Defense Agency, says that discriminating between decoys and warheads “is hard,” that the methods used in the test to analyze that discrimination “were fragile,” and that the sensors “did not have as robust a discrimination method” as the Raytheon vehicle that was eventually selected over the TRW and Boeing design. However, he declined to discuss technical details.

    The performance of the sensor is the latest in a long line of complaints about the way the test was prepared, conducted, and analyzed. Nira Schwartz, an engineer formerly with TRW, says that the company used faulty algorithms in the design of the test and then covered up the sensor's failure to identify the targets. A panel of academics, including several from Lincoln Lab, concluded in a 1998 study that some algorithm designs were “questionable” but that the overall experiment was “basically sound.” Schwartz was fired in 1996, and she is suing TRW and Boeing, the prime contractor, for violations of the False Claims Act. Lawyers for TRW and Boeing say the charges are spurious.

    Meanwhile, at the prompting of Postol, Vest has initiated an inquiry into whether Lincoln Lab researchers provided sufficiently independent analysis of the test flight program in that 1998 study. However, Postol last month complained to the MIT trustees that Vest was dragging his heels. MIT provost Bob Brown declined to discuss details of the inquiry, which would precede a formal investigation.


    'Borrowed Immunity' May Save Future Victims

    1. Martin Enserink

    Government investigators want to have something more than antibiotics on hand should anthrax terrorists strike again. Of the 11 bioterrorism victims who came down with inhalation anthrax last fall, five died despite the powerful antibiotics they were given. Now investigators at the Centers for Disease Control and Prevention (CDC) and other federal agencies are seeking permission to treat future severe cases with an experimental therapy designed to confer instant immunity against the bacterium's deadly toxin: blood plasma from military personnel vaccinated against anthrax.

    The proposal has passed CDC's internal ethics board and could be sent for approval as early as this week to the Food and Drug Administration (FDA), says Bradley Perkins, chief of CDC's Meningitis and Special Pathogens Branch. One question to be explored is whether the plasma batch CDC proposes to use complies with FDA regulations. Meanwhile, CDC, the National Institutes of Health (NIH), and the U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID) also plan to conduct a series of animal experiments to determine whether the therapy will work and how much plasma would be needed.

    Arms supply.

    Plasma from vaccinated volunteers may provide an experimental anthrax treatment.


    Antibiotics can fail, researchers say, because Bacillus anthracis churns out a toxin that continues to wreak havoc even after the bacteria have been killed. Scientists hope that the variety of antibodies in the injected plasma from vaccinated people will be able to eliminate the toxin. But in the long run, CDC plans to use antibodies, or immunoglobulin, that have been purified from the plasma.

    The approach, called passive immunotherapy, has a long history: Plasma from vaccinated horses was the only available anthrax treatment in the preantibiotic era, and it's still used in Russia and China. Unfortunately, says USAMRIID anthrax researcher Arthur Friedlander, none of the reports about its efficacy in humans meets modern scientific standards. In animal studies, the strategy has been shown to work only when antibodies were given prior to anthrax exposure. Given this paucity of data, the proposed treatment would be given only as an adjunct to antibiotics, says Perkins, and only to failing patients who didn't improve on antibiotics alone.

    The current plasma supply, which was collected from military personnel before the attacks, is far from ideal. Only 135 plasma units of about 600 milliliters each are available; part of that will be used for animal tests, and the remainder would suffice for a few dozen patients at most, says Perkins. And because the plasma was collected with scientific experiments in mind, not for use in humans, collection and storage procedures may raise eyebrows at the FDA. Some of the vaccinees were also enrolled in vaccine trials using live agents such as Venezuelan equine encephalitis virus and tularemia, making them less desirable donors. But a draft of the treatment protocol argues that this poses only a small risk, because the pathogens were attenuated and the trials took place at least 2 months before plasma was collected.

    The investigators also want to collect a second, larger batch of plasma from vaccinated volunteers for use in both treatment and animal studies. But it would still be a fairly modest amount—perhaps three times what's available now. “Most [investigators] are not willing to stockpile this material in any serious quantity without much better data about its efficacy in animals,” says Perkins. Even if the product is shown to work in animals and becomes an accepted treatment, he adds, “it would be an interim solution at best.”

    Researchers would prefer a treatment that does not rely on volunteers. That's why many anthrax researchers are following the work of a team led by Brent Iverson and George Georgiou of the University of Texas, Austin, who claim to have created so-called monoclonal antibodies that cling to the anthrax toxin much better than those produced by the human body. Georgiou and Iverson decline to discuss the findings, which were presented at a meeting last summer, until they are published. But Stephen Leppla, an anthrax researcher at NIH familiar with the results, says the antibodies have a 40-fold improved affinity and kept rats injected with the toxin alive. The next logical step, he says, would be to see whether the product can also save animals with a real anthrax infection.

    The results should demonstrate whether one type of antibody, even if it's very good, can work as well as the broad mix produced by the immune system, says Friedlander. But, he says, the production of improved antibodies “is a good idea that certainly deserves to be evaluated.”


    'Hot' Legacy Raises Alarm in the Caucasus

    1. Richard Stone

    VIENNA— Can a crack international team secure two tremendously radioactive objects in the mountains of a strife-torn former Soviet republic before they fall into the hands of nuclear terrorists? The question may sound like a trailer for a James Bond movie, but it's for real. Science has learned that the International Atomic Energy Agency (IAEA) early this week dispatched a team to a remote area near Georgia's breakaway Abkhazia region to help local officials grab the dangerous, portable devices.

    Georgia is a hot spot for illicit trafficking of nuclear materials (Science, 1 June 2001, p. 1632). Although Western governments have worked hard to help Russia and other countries secure their fissile material, the 11 September attacks have heightened fears of terrorists using other radioactive materials, from discarded medical isotopes to uranium mining tailings, to make “dirty bombs” that could spread radioactivity over large areas. Feeding those fears is the fact that the materials are usually less secure than weapons-grade nuclear caches and often have been abandoned by former owners.

    The crisis began with a fax on Christmas Eve from Georgian authorities. Three men gathering wood near Lja on 2 December 2001 had found two containers that appeared to have melted the nearby snow. Lugging the containers back to their campsite for warmth, the men soon became dizzy and nauseous and started vomiting. Within a week, radiation burns began to develop on their backs. On 4 January 2002, IAEA dispatched three investigators to Tbilisi, but heavy snows and rough terrain prevented them from reaching the objects.

    Too hot to handle.

    A Soviet schematic shows that the Georgian radioactive canisters (top) were at the heart of an unusual thermogenerator.


    This is not the first time such containers have been found. In 1998, not far from Lja, a fisherman found one in a riverbed. Physicists in Tbilisi later discovered that it was packed with strontium-90 emitting a whopping 40,000 curies of radiation, equivalent to the radiation from strontium-90 released during the 1986 Chornobyl explosion and fire. “My shock was so great when I was informed of this,” says Abel Julio González, director of IAEA's division of radiation and waste safety. “I was convinced they had made a mistake.” But an IAEA team confirmed the readings and whisked the object—along with a second one found soon after—to a guarded location in Tbilisi.

    Western officials initially did not know what the containers were used for nor how many had been built. A request for information from Russia yielded sympathy but few answers, says an IAEA official, as Russian authorities insisted they were not liable for nuclear materials found in other former Soviet republics.

    Slowly, however, González and his team began to piece together the puzzle. They obtained a schematic of a device that used the mystery containers to generate electricity from the strontium's heat, possibly to power remote radio transmitters. The units recovered so far are encased in a titanium-based ceramic. As the beta particles streaming from the strontium-90 slam into the metal shielding, part of the energy is converted into x-rays and part into heat. Soviet labs apparently produced several hundred of the generators, including some with radioactivity levels as high as 100,000 curies. None of these high-powered models have yet turned up, and only a handful of the 40,000-curie devices have been recovered in covert operations in four countries: Georgia, Belarus, Estonia, and Tajikistan.
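    For a sense of scale, a back-of-the-envelope calculation shows why such a canister could melt the surrounding snow. The 40,000-curie activity comes from the article; the average beta energies for strontium-90 and its short-lived daughter yttrium-90 are standard decay-data values, not figures from the report.

    ```python
    # Rough thermal power of a 40,000-curie strontium-90 source.
    # Decay energies are textbook averages (Sr-90 beta: ~0.196 MeV;
    # Y-90 daughter beta: ~0.934 MeV), not values from the article.

    CI_TO_BQ = 3.7e10      # decays per second in one curie
    MEV_TO_J = 1.602e-13   # joules per MeV

    activity_ci = 40_000
    avg_mev_per_decay = 0.196 + 0.934  # Sr-90 beta plus Y-90 daughter beta

    decays_per_s = activity_ci * CI_TO_BQ
    power_watts = decays_per_s * avg_mev_per_decay * MEV_TO_J

    print(f"{power_watts:.0f} W")  # on the order of a few hundred watts
    ```

    A continuous heat source of a few hundred watts, with no moving parts or fuel supply, is exactly what made these generators attractive for powering remote transmitters, and what makes an unshielded one so dangerous to handle.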

    Radiation injuries from orphaned sources are “a very real problem,” says George Vargo of the U.S. Pacific Northwest National Laboratory in Richland, Washington. But the Georgian men being treated for severe x-ray burns are the first confirmed victims of the Soviet thermogenerators.

    The IAEA team members had planned to wait until this month to assist in recovering the containers, but they decided to move more quickly after an urgent appeal last week from the Georgian government. They arrived in Tbilisi on 27 January. Officials from IAEA and Georgia, France, Russia, and the United States are expected to meet in Tbilisi on 4 February to review the recovery effort and discuss the lingering threat of other orphan sources. Although much of the strontium-90 will probably be stored as radioactive waste, the agency is also mulling a suggestion to sell some of it to hospitals as a source of the short-lived daughter isotope yttrium-90, an experimental treatment for cancer and arthritis.


    Primate Parthenotes Yield Stem Cells

    1. Constance Holden

    A reproductive quirk of some reptiles, insects, and other species may help stem cell researchers sidestep ethical debates over the use of human embryos. Researchers at Advanced Cell Technology (ACT) in Worcester, Massachusetts, report on page 819 that they have isolated the first stem cell lines from primate parthenotes, embryos grown from unfertilized eggs that, in mammals, are not capable of developing into viable fetuses.

    In October, Yan-Ling Feng and Jerry Hall of the Institute for Reproductive Medicine and Genetics in Los Angeles showed they could derive stem cells, which later developed into neurons, from mouse parthenotes. Then in November, ACT scientists grabbed headlines with the news that they had created human parthenotes—although the cell clusters died before reaching the blastocyst stage, well before viable stem cell lines could be extracted.

    Now Jose Cibelli and colleagues at ACT report that they have been able to culture a variety of cell types, representing all three germ layers, from stem cells taken from monkey parthenotes. To create the parthenotes, the scientists treated 28 macaque ova with chemicals that prevent eggs from ejecting half their chromosomes—as they do when fertilized—and instead spur the eggs to begin dividing. Four of the 28 developed into blastocysts; the team was able to establish a stable stem cell line from the inner cell mass of one of them. From these stem cells, the researchers developed a considerable variety of cells, including dopamine-producing neurons and spontaneously beating cells resembling heart cells.

    Other teams have teased primate ova into blastocysts parthenogenetically, but this is the first report that such blastocysts can yield stem cells. The implication of this work, says Don Wolf of the Oregon Regional Primate Research Center in Beaverton, who has generated monkey parthenotes in his lab, is that “[embryonic stem] cells can be derived from human parthenotes.”

    Virgin division.

    Nerve (pink) and pigment (black) cells grow from stem cells plucked from a parthenote.


    Not everyone agrees. Developmental biologist Davor Solter of the Max Planck Institute for Immunobiology in Freiburg, Germany, says that even though the researchers have succeeded in generating normal-looking stem cells from monkey parthenotes, this reveals little about whether the same can be done in humans: “Every single mammal has its own quirks. If you want to figure out how to make [parthenotes] in humans, you have to make them in humans.”

    Ethically, however, the option is attractive. As in other primates, human parthenotes cannot develop to full-term babies. If researchers can find a reliable way to derive stem cells from human parthenotes, they could avoid therapeutic cloning, in which a potentially viable embryo is created as a source of stem cells and then destroyed. Bioethicist Glenn McGee of the University of Pennsylvania in Philadelphia predicts that this won't quell all objections, because people uneasy about stem cell research won't be very comfortable with “the idea of producing a creature whose status as a life-form is entirely ambiguous.” Nonetheless, he observes that “the arguments against using embryos in research would seem to suggest that the parthenote is the ideal subject to replace the embryo.”

    Wolf says parthenogenesis would actually be simpler than therapeutic cloning for producing genetically compatible material for a patient—at least one with oocytes. “Of course, with this approach,” he adds, “you could not produce your own stem cells unless you could also provide your own eggs. Sorry, guys.”


    Molecular Motors Move in Mysterious Ways

    1. Jennifer Couzin

    Behind a beating heart, fingers running fluidly across a piano, or a stomach cell shuffling nutrients to its neighbor are hundreds of motor proteins that make such motion possible. Yet even as biologists have been classifying these proteins and delineating their structures, they have long debated one critical question: How moveth the motors themselves?

    Now, a trio of biologists delivers another in a series of jolts to this field. On page 844, Wei Hua and colleagues Johnson Chung and Jeff Gelles of Brandeis University in Waltham, Massachusetts, dispute the widely accepted mechanism of motion for kinesin, a well-studied member of the motor protein class. The three propose that kinesin, responsible for propelling cellular components and proteins along stiff fibers called microtubules, crawls like an inchworm rather than taking even, symmetrical steps. The theory is striking for, among other issues, its pronouncement that kinesin's two structurally identical “heads,” clusters of amino acids that do most of the enzyme's work, perform vastly different tasks.

    Already the work is prompting sharp words and reflection from those in the motor protein field. “Many of our beliefs and the models we've been proposing may turn out to be spectacularly wrong,” says Steven Block, a biophysicist at Stanford University in Palo Alto, California, referring not only to the Hua paper but also to a parallel upheaval in the study of myosins, another major group of motor proteins.

    Ironically, Gelles's group set out to prove the dominant theory of kinesin movement: that the enzyme's two heads alternately and symmetrically step over each other along the microtubule, a motion known as “symmetric hand-over-hand.” To attempt to confirm hand-over-hand, the researchers made several adjustments that enabled them to follow individual kinesin molecules, which are normally just 70 nanometers long. First, they anchored kinesin to a glass plate but let the microtubule, a far larger structure to which kinesin is attached, move freely. Second, they slowed the motion of kinesin by restricting its access to energy-providing adenosine triphosphate (ATP). Finally, to amplify minuscule microtubule movements, the group made the connection between kinesin and the plate more rigid.

    No spinning around.

    A microtubule didn't rotate as expected under kinesin's power.


    When Gelles and colleagues let the motor run, they did not witness the scene they'd expected. Symmetric hand-over-hand demands that each head make a 180° rotation for every 8-nanometer step it takes, says Wei Hua, now at Yale University. But the scientists, whose technology was capable of detecting rotation above 31°, found none at all. The group proposes a new “catch-up” model for kinesin movement: One head pushes forward 8 nanometers, stops, and drags the second along toward it.

    Gelles's team was forced to make a secondary, decidedly unorthodox proposal to make its model fit with a basic rule of kinesin biology: that one ATP energy molecule is burned for every 8-nanometer step. In hand-over-hand, the heads presumably alternate burning, or hydrolyzing, ATP. Here, both heads forge the same 8-nanometer distance each time; therefore, only one head could be hydrolyzing the ATP molecule. The other head, the scientists predict, is chemically inactive while kinesin moves.
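    The distinction between the two models can be made concrete with a toy simulation. This sketch is purely illustrative (it is not the authors' analysis); it uses only the figures in the article: an 8-nanometer advance per ATP, and a roughly 180° rotation per step demanded by symmetric hand-over-hand but not by the inchworm "catch-up" model.

    ```python
    # Toy comparison of two proposed kinesin stepping models.
    # Hand-over-hand: the trailing head leapfrogs the leader (a 16 nm move
    # for that head), implying ~180 degrees of stalk rotation per step.
    # Inchworm ("catch-up"): one head always leads; no rotation required.

    STEP_NM = 8  # net advance along the microtubule per ATP hydrolyzed

    def hand_over_hand(n_steps):
        heads = [0, STEP_NM]   # positions of the two heads (nm)
        rotation = 0           # cumulative stalk rotation (degrees)
        for _ in range(n_steps):
            trailing = heads.index(min(heads))
            heads[trailing] += 2 * STEP_NM  # trailing head swings past the leader
            rotation += 180                 # each leapfrog is a half-turn
        return heads, rotation

    def inchworm(n_steps):
        heads = [0, STEP_NM]
        for _ in range(n_steps):
            heads[1] += STEP_NM  # leading head advances 8 nm...
            heads[0] += STEP_NM  # ...then drags the trailing head up behind it
        return heads, 0          # no net rotation

    print(hand_over_hand(4))  # ([32, 40], 720): same advance, plus rotation
    print(inchworm(4))        # ([32, 40], 0): same advance, zero rotation
    ```

    Both models give the same 8 nm of progress per ATP, which is why they are hard to tell apart by tracking position alone; the rotation is the observable that separates them, and it is what the Brandeis group looked for and failed to find.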

    That two identical structures could wind up with such divergent jobs is a hotly disputed point of the paper. “It's hard to envision how you could relegate [different] functions to the two heads, given that they're produced by the same gene,” says Sharyn Endow, a molecular geneticist at Duke University in Durham, North Carolina. Endow and others point to previous work they say shows that both heads hydrolyze ATP. The authors of the new study stand by their story but admit they're as befuddled as everyone else. “Maybe there's a reason for [the presence of] two heads that we don't know,” says Hua.

    Meanwhile, other researchers, such as Joe Howard, director of the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, Germany, favor yet a third model. In “asymmetric hand-over-hand,” the kinesin heads step over each other but rotate little. The mystery might be solved if researchers can overcome a technical challenge universal to kinesin motion studies: the difficulty distinguishing between the two tiny heads. Scientists are experimenting with special dyes to do just that.

    Myosin researchers can sympathize with their kinesin brethren. Recent work on these motors, which control muscles and transport various proteins, shows that two family members don't move as believed. Myosin VI, whose function remains a puzzle, apparently edges backward and takes far larger steps than its structure suggests is possible. And myosin V has been found to stay stuck to its filament during motion rather than lifting off periodically.

    Motor molecules are “capable of some pretty surprising things that we might not have predicted,” says Richard Cheney, a cell biologist at the University of North Carolina, Chapel Hill. And they're taking scientists along for the roller-coaster ride.


    Reforms Spark More Jobs—and Protests

    1. Xavier Bosch*
    1. Xavier Bosch is a science writer in Barcelona.

    BARCELONA— Spain's government sees it as a cure for cronyism. The universities see it as an infringement on their autonomy. The bone of contention: a new law governing hiring practices that has triggered a mad rush to fill academic posts and has sparked a bitter row between the universities and the education ministry that funds them.

    Last December, Spain's parliament passed government-sponsored legislation that subjects candidates for academic posts to peer review by national panels before they can apply for a job. In the weeks leading up to the law's passage, university rectors assailed the legislation, arguing, among other issues, that it would erode the autonomy of Spain's public universities, impeding their ability to hire top talent. At one point, the rectors appeared to be winning the public relations battle: On 1 December 2001, more than 100,000 people took to the streets to protest the law. But they lost the war when the bill became law a few weeks later.

    Now the rectors are under fire from their own rank and file. In a 3-week period last fall, Spain's 48 public universities advertised some 4600 new positions, roughly twice the number normally posted in an entire year. Because the jobs were advertised before the new law took effect on 13 January, the slots will be filled under the old rules, in which five-member appointment boards select candidates by majority vote. But hiring so many people this year will have “hugely negative effects” by sharply limiting opportunities for young researchers in coming years, predicts inorganic chemist José Vicente of the University of Murcia.

    Not reform-minded.

    A recent protest of the new university law sent thousands into the streets of Madrid.


    The government's reforms are designed to reduce the universities' influence over the appointments board. Two of the five board members come from the university, so only one other member must be persuaded for the university to land its favored candidate. Thus the deciding vote often is “largely influenced by favoritism and mutual self-interest,” contends astrophysicist Antonio Ferriz-Mas of the University of Vigo. An education ministry survey appears to offer some support for that claim: Professorial posts handed out under the old system went to internal or local candidates over 90% of the time.

    According to the law, a new agency will first review the qualifications of aspiring applicants to sort the wheat from the chaff. Those who pass muster can present themselves to national boards of experts, who would recommend the best applicants to the university for final selection. The law will ensure that only capable individuals land professorial posts, says physicist Luis Rull-Fernández of the University of Sevilla.

    However, the Spanish Council of Rectors (CRUE) claimed in a statement that the law erodes university autonomy, which it calls a “fundamental right” under Spain's constitution. CRUE president Saturnino de La Plaza says his organization may mount a court challenge. He defends last fall's mass job postings, which he says aim primarily to give permanent posts to researchers who have toiled for years on temporary contracts.

    The education ministry dismisses CRUE's explanation. It said that the “hasty and massive” posting is an attempt to “avoid a more open, competitive, and transparent recruitment to ensure the quality of research in universities.” Some see darker motivations for the rectors' opposition to the new law: As Rull-Fernández points out, it requires all rectors to step down in 6 months, paving the way for a new generation of academic leaders.


    When in Doubt, Mice Mate Rather Than Hate

    1. Mary Beckman*
    1. Mary Beckman is a writer based in southeast Idaho.

    A new genetically modified mouse abides by the motto of the psychedelic age: Make Love, Not War. A male that can't sniff out the sex of its partner will, to put it delicately, try to partner with it rather than attack it. The same mutation could never lead to a peaceable kingdom among humans, however, because the part of the brain responsible for the mice's amorous behavior is as vestigial in humans as the appendix.

    The research, led by Harvard molecular neuroscientist Catherine Dulac and published online by Science this week, suggests that the default social interaction for mice is to mate. Only a scent-based cue from another male inhibits a male's urge to mate and spurs him to fight. A single gene controls this behavior; it encodes the protein TRP2, which sits on the surface of certain olfactory nerves that detect pheromones.

    Calling the work “superlative,” neurobiologist Emily Liman of the University of Southern California (USC) in Los Angeles says, “it opens the way for genetic analysis of a plethora of behaviors,” including sexual maturation, gender recognition, and spontaneous abortions in mice, all of which are influenced by pheromones. It also debunks the notion that mating has to be evoked by a pheromone that tells a male it's in the presence of a female.

    Mating games.

    Male mice lacking TRP2 don't know they should be fighting with each other.


    Mice have two olfactory systems. Airborne smells trigger the main olfactory epithelium that sends messages to the primary olfactory cortex. Pheromones—personal identification molecules that emanate from both males and females—stimulate a batch of 400 nerve cells in the nose-based vomeronasal organ (VNO). The VNO sends signals to the hypothalamus, a brain region involved in reproduction, defense, and eating. TRP2 resides only in these VNO cells.

    To find out what TRP2 contributes to pheromone detection, Dulac and colleagues deleted the TRP2 gene; they then bred mouse strains that had two, one, or no copies of the gene. All of the animals reproduced as if nothing were amiss. But controlled introductions between individual mice revealed the effect of the missing gene.

    Male lab mice have a black-and-white worldview: They defend their cages aggressively from other males but put the moves on any females. The researchers dropped in an intruder mouse, observed the interaction, and monitored the nerves firing in the resident mouse's VNO. The team ensured that the intruders were giving off a strong pheromone signal by using either females in estrus or males that had been castrated (castrati are not aggressive and won't start a fight) and daubed with pheromone-rich urine from intact males.

    As expected, resident males with one or both chromosomal copies of TRP2 mounted introduced females. When a urine-daubed eunuch mouse was allowed entry, the males with TRP2 picked a fight. Male mice with no copies of the gene, however, tried to mate with either type of visitor. If offered both companions at the same time, the TRP2-negative mice spent just as much time trying to mate with the males as the females. Males without TRP2 also courted eunuchs that hadn't been spritzed with urine, suggesting that a pheromone signal isn't needed to enkindle mouse romance.

    The knockout mice aren't entirely peaceniks; they will fight back if provoked by other males. Their VNO neurons looked normal and fired if stimulated. But the neurons were quiet, compared to the same neurons in normal mice, when the knockout mice interacted with pheromone-doused companions. The researchers conclude that TRP2 is necessary for detecting pheromones that indicate whether a strange mouse is a male.

    If the mouse VNO controls basic behavior such as mating and fighting, and humans have remnants of this system, at what point in our evolutionary past did humans “overcome” being controlled by pheromones? USC's Liman, who studies the TRP gene family in primates, is trying to answer that question. But not everyone is pleased that humans have apparently largely abandoned pheromones when making mating decisions. “The perfume industry would like consumers to believe it's not vestigial,” Liman says.

    Many researchers say they were surprised that a single gene has such a marked effect on sexual behavior. Says neurobiologist Charles Zuker of the University of California, San Diego, “I would have expected that the sexual identity of a mate was not solely determined by one pheromone cue—mating is so extraordinarily important biologically.” The bohemian mice seem to agree: Love is fundamentally more important—biologically speaking—than war.


    Planned Reactor Ruffles Global Feathers

    1. Richard Stone

    VIENNA— Western officials are raising safety concerns over Myanmar's plans to build its first nuclear reactor. The small reactor would produce medical isotopes and test the feasibility of bringing nuclear energy to the poverty-stricken country, formerly known as Burma. It would also give Russia, which would supply the reactor and technical support, a larger presence in the region.

    A groundbreaking ceremony was scheduled for last week at a military complex near Magwe, a central region bearing Myanmar's richest uranium deposits, a U.S. Defense Department official told Science. The reactor would have a capacity of 10 megawatts and cost roughly $25 million. The Myanmar government confirmed privately to the International Atomic Energy Agency (IAEA) that more than 200 of its scientists and technicians have received nuclear training in Russia in recent months.

    Both the Soviet Union and the United States built research reactors around the world during the Cold War as part of a competition to promote the peaceful use of atomic energy. Some reactors became huge proliferation risks. During the Vietnam War, for instance, U.S. Special Forces tried to recover plutonium from a U.S.-made research reactor in the south that had been seized by communist troops—only to find that the fuel was already gone, says Stanford University's George Bunn, a former negotiator on the nuclear nonproliferation treaty. In recent years the United States has striven to help other countries convert research reactors that use weapons-grade nuclear fuel into ones that consume low-enriched uranium (LEU).

    Role model?

    International authorities hope Myanmar will meet the same safety standards followed by U.S. reactors.


    The U.S. government reviews U.S.-led projects to build reactors in foreign lands but has little sway over deals that other countries strike. In a press conference last week, a State Department spokesperson said the government expects Myanmar “to not produce unsafeguarded fissile material.” According to the Defense official, the government is worried that the reactor could increase the threat of radioactive materials falling into the hands of terrorists (see p. 777).

    Other analysts generally discount the proliferation risk. “From the size of it, it looks like an LEU reactor,” says Fred Wehling of the Center for Nonproliferation Studies in Monterey, California. Indeed, as a member of the Southeast Asia Nuclear Weapon-Free Zone, Myanmar “has accepted significant restrictions on nuclear-related activities” under an agreement that allows member states to pursue peaceful research, says Ralph Cossa, president of the Center for Strategic and International Studies' Pacific Forum. “I see very little real threat,” he says, “especially if the Russians insist on proper safety and monitoring procedures.”

    Whether that will happen is an open question. An IAEA team that visited Myanmar in November 2000 concluded that the country's radiation protection infrastructure was “not meeting the expected standards,” says an agency official, and followed up with a list of improvements needed to operate the reactor safely. The agency has not yet received a response. Myanmar's foreign ministry declined to make officials available for interviews and referred inquiries to a press conference transcript on the government's Web site.

    Perhaps most intriguing is what the deal may mean for regional stability. “It shows some concern [in Myanmar] with not getting too dependent on China, as well as Russia's efforts to increase its own footprint in Southeast Asia,” says Cossa. Others add that Russia's cash-strapped energy industry could be tempted to strike additional deals if the Myanmar regime deems nuclear power vital to the country's future.


    Census Case Tests Statistical Method

    1. Charles Seife

    Justices of the U.S. Supreme Court are hiking up their robes in preparation for another march through the political swamp of reapportionment. But more than a congressional seat may be at stake. In agreeing last week to hear a case (Utah v. Evans) stemming from the 2000 census, the high court will also be examining the legality of a time-honored statistical method for filling in the blanks.

    The method, called “hot-deck imputation,” goes back to the dawn of the computer age, says statistician Joseph Schafer of Pennsylvania State University, University Park. The term refers to the deck of punch cards that the Census Bureau once used to store data. When a statistician came across a card that was improperly or incompletely filled out, officials were forced to “impute”—essentially, make an educated guess about—the missing data. One technique involved finding a household as similar as possible to the one with missing information and borrowing its values. “Cold-deck” imputation used cards from the previous census, whereas “hot-deck” drew on the then-current one.
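
The donor-matching idea can be sketched in a few lines of code. This is a minimal illustration only, not the Census Bureau's actual procedure; the field names and the match-count similarity rule are invented for the example.

```python
# Minimal sketch of hot-deck imputation: a record with a missing field
# borrows the value from the most similar complete record (the "donor")
# in the SAME, current dataset -- hence "hot" deck.

def hot_deck_impute(records, key_fields, target):
    """Fill missing `target` values by copying from the donor record
    that matches the incomplete record on the most key fields."""
    donors = [r for r in records if r.get(target) is not None]
    for r in records:
        if r.get(target) is None:
            best = max(donors, key=lambda d: sum(
                d[k] == r.get(k) for k in key_fields))
            r[target] = best[target]
    return records

# Hypothetical household records (fields invented for illustration).
households = [
    {"zip": "84101", "size": 4, "tenure": "own",  "age": 41},
    {"zip": "84101", "size": 4, "tenure": "own",  "age": None},  # incomplete
    {"zip": "27601", "size": 1, "tenure": "rent", "age": 29},
]
hot_deck_impute(households, ["zip", "size", "tenure"], "age")
print(households[1]["age"])  # borrowed from the matching donor: 41
```

The incomplete household matches the first record on all three key fields, so it inherits that donor's value rather than being dropped from the count.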

    Utah is the latest in a series of legal battles over census methods. In 1999, the Supreme Court declared in a 5-4 ruling that statistical “sampling”—performing a detailed survey of a subset of the population and using those data to compensate for flaws in the general census—could not be used to apportion congressional seats. Utah contends that it unfairly lost a congressional seat to North Carolina because the bureau used hot-deck imputation, which the state argues should not be allowed under the no-sampling rule. The Census Bureau and Commerce Department, two of several defendants, counter that imputation is consistent with the “actual enumeration” clause of the U.S. Constitution and that it is distinct from sampling.

    Fighting mad.

    Old-fashioned punch cards lend their name to a statistical technique that is now before the court.


    “I've been wrestling with this [question] for a while,” says Alan Zaslavsky, a statistician at Harvard Medical School in Boston. “It has some features in common, but it's not what I usually think of when I think of sampling.” One complicating factor is that the census surveys the whole population rather than taking the more common approach of selecting a subset and then drawing inferences about the rest of the population. Clearly, that common use of sampling differs from imputation, which is used to draw conclusions from nonresponses and incomplete data.

    But getting rid of imputation would cause immense problems, according to Schafer and Zaslavsky, without obvious solutions. “Throw out imputation, and you throw out a lot of things,” says Schafer. “You toss out editing of the data and making sure that it satisfies consistency checks. Now, if someone puts down an age of 145, that's not going to fly. [But] if imputation is not acceptable, what are we to do then?” Throwing the data away would be an implicit imputation, says Zaslavsky. “Another assumption is that there's no population in homes that don't respond. That doesn't seem like a likely story. But if you say you can't do any imputation, that's effectively what you're assuming.”

    Last year three Utah judges ruled that imputation was acceptable. If the Supreme Court disagrees, it could be difficult to impute the Census Bureau's strategy for 2010.


    Report Backs Collider and an Expanded Field

    1. Charles Seife

    U.S. high-energy physicists want to redefine their field to include the entire cosmos. But they also have a very down-to-earth proposal for the government to back their next multibillion-dollar machine.

    “Participation in a linear collider is absolutely essential to the field,” says Barry Barish, a physicist at the California Institute of Technology in Pasadena and co-chair of the High Energy Physics Advisory Panel (HEPAP) subcommittee that drafted a 20-year road map with the Next Linear Collider (NLC) at its center. Whether or not it is the host, the panel argues, the United States must have a central role in building the machine. HEPAP adopted the report this week at a meeting in Washington, D.C.

    World in collision.

    The site of the Next Linear Collider is up for grabs; Batavia, Illinois, and Hamburg, Germany, are seen as the front-runners.


    Current cost estimates put the linear collider, which will smash electrons and antielectrons together at about half a trillion electron volts of energy, at about $5 billion to $7 billion. The HEPAP plan, which ratifies the consensus hammered out at Snowmass, Colorado, last summer (Science, 27 July 2001, p. 582), calls for the host country to pay two-thirds of the bill. The panel recommends that the United States bid to host the facility at a site using existing expertise at a national laboratory such as Fermi National Accelerator Laboratory in Batavia, Illinois, or the Stanford Linear Accelerator Center in California.

    Hosting NLC would require an annual high-energy physics budget some 30% higher than the $716 million now being spent by the Department of Energy (DOE). Building it overseas—most likely in Germany or Japan—would mean only a 10% boost. If the budget doesn't increase by at least 10%, says Barish, “we can't have a significant role in the linear collider.” James Decker, acting director of DOE's Office of Science, declined to comment on the budgetary implications of the proposal. “But let me assure you that we will take the plan very, very seriously,” he says.
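
The budget stakes quoted above work out roughly as follows; this is purely illustrative arithmetic using the article's figures, not an official DOE projection.

```python
# Rough annual implications for the DOE high-energy physics budget,
# from the numbers quoted above: a $716 million base, +30% to host
# the collider, +10% to participate in a machine built overseas.
base_millions = 716
host_extra = round(base_millions * 0.30)    # hosting premium
abroad_extra = round(base_millions * 0.10)  # participation premium

print(host_extra)    # about $215 million more per year to host
print(abroad_extra)  # about $72 million more per year to join abroad
```

Either way, the panel's "at least 10%" floor translates into tens of millions of new dollars per year before the U.S. can claim a significant role.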

    The road map also calls for a panel to rank upcoming experiments and discusses opportunities in particle physics other than those presented by colliders, such as neutrino physics, symmetry-breaking experiments at B factories, and cosmological tests of the mysterious “dark energy” that seems to be causing the universe to expand faster and faster. This agenda reflects a shift in the definition of particle physics from a field concerned with the fundamental building blocks of matter and the forces that act upon them. “That's narrower than what the field is today,” says Barish. The road map dubs particle physics the study of “matter, energy, space, and time,” which encompasses studying dark energy and extra dimensions, as well as traditional topics such as quarks and leptons.

    The breadth of the report should mollify high-energy physicists who are not associated with collider work and are afraid of being left behind, but the panel made clear that it would not sacrifice the next collider in favor of new ventures. “[NLC] promises to be one of the great scientific adventures of our time,” says Jonathan Bagger, a physicist at Johns Hopkins University in Baltimore, Maryland, and co-chair of the subpanel. “It's a rare opportunity and one that should be seized by the U.S.”

    2003 BUDGET

    Bioterrorism Drives Record NIH Request

    1. Jocelyn Kaiser

    President George W. Bush will propose another record increase for the National Institutes of Health (NIH) next week in his 2003 budget request to Congress. The additional $3.7 billion represents a 16% rise and would complete a long-cherished 5-year doubling of NIH's budget, to $27.3 billion. But the victory isn't entirely sweet: More than half of the new money would go to combat bioterrorism and to cancer research, meaning that most of NIH's 27 institutes will likely get much smaller increases than their supporters had hoped.

    Administration officials released the good news about NIH, which fulfills a campaign promise, some 10 days before the president's overall budget is unveiled on 4 February. The 2003 request follows on a $3 billion rise for the current fiscal year, to $23.6 billion.

    The $1.5 billion jump for bioterrorism is a sixfold increase over the current $300 million being spent by NIH. And the big winner on the Bethesda, Maryland, campus is the National Institute of Allergy and Infectious Diseases (NIAID), which would receive 95% of the bioterrorism dollars, according to its director, Anthony Fauci. The projects would include basic research, such as genome sequencing of bioterrorism agents, as well as work on anthrax vaccines and construction of new high-containment facilities that would let researchers at NIH and elsewhere work on dangerous pathogens (see table). The agency's budget is now $2.4 billion, and the 2003 request may bump up other research areas as well.


    The other area the Bush budget favors, as pledged during the campaign, is cancer-related research, most of which is done by the National Cancer Institute (NCI). Cancer research across NIH would receive $5.5 billion, a nearly 13% increase.

    The doubling “is really good news” in an era of rising defense spending and a return of deficit spending, says budget analyst Dave Moore of the Association of American Medical Colleges. However, institutes other than NCI and NIAID may get as little as an 8% increase, biomedical groups expect. “I think it will be of concern to some people in the community,” says Moore, noting recent annual rises of roughly 13%.

    NIH acting director Ruth Kirschstein sees the budget as affirmation of the doubling campaign. “The president considers this a doubling, and as far as I'm concerned, it's a doubling. … We are very pleased.” The next step, as always, is up to Congress, where legislators are expected to be similarly generous.


    MgB2 Trades Performance for a Shot at the Real World

    1. Robert F. Service

    Magnesium diboride can't match the properties of high-temperature superconductors. But it is far easier to work with, which may prove even more valuable in the end

    To the ancient Greeks, he was known as Hermes, the messenger god. Updated to rule over modern scientific disciplines, this fleet-footed deity and renowned trickster would undoubtedly govern superconductivity. The phenomenon owes its existence to agile electrons that sprint through superconductors unencumbered, without the resistance that plagues electrons in standard wires. But researchers in the field sometimes feel like the butt of an ongoing cruel joke: Every tantalizing new superconductor invariably turns up a serious flaw.

    Perhaps Hermes has gone soft, or he's developed a little sympathy for mortals. Last year, a team of Japanese researchers discovered that a powdery black compound called magnesium diboride (MgB2) is a superconductor—one that so far seems to be holding its luster. After just a year of working with the material, researchers are already pushing MgB2 headlong toward applications. They've forged it into long wires, a feat some high-temperature superconductors (HTS) have yet to match. Two new companies are now working to commercialize the results. “The area of hot activity has shifted from basic research to applied,” says Paul Canfield, a physicist at Iowa State University in Ames. “This is a healthy transition and incredibly rapid.”

    But as heady as this progress seems today, MgB2 still faces numerous hurdles before anyone makes any money from it as a superconductor. And because commercializing superconductors is never as easy as it seems, Hermes may still have the last laugh.

    He certainly had the first. Many theorists believed for years that conventional metallic alloy superconductors would never superconduct above 20 kelvin. Around that temperature, researchers thought, the added heat would create vibrations in the material's crystalline lattice that would break up pairs of electrons that surf through the material together, the signature of superconductivity. But MgB2 turns out to be extremely efficient at linking electrons in pairs and superconducts all the way up to 39 K. That's well below the 138 K of the record HTS material, a copper oxide-based compound. But it's potentially still useful for widespread applications, such as making transformers used by the electric power industry. And unlike the copper oxides, MgB2 is simple to make, cheap, and relatively easy to turn into wires.

    Practical advantage.

    Unlike some rival materials, crystalline superconductor MgB2 is readily made into wire.


    Fifteen years after the discovery of the copper oxides, companies are only now nearing commercialization of the first generation of HTS wires, generators, and motors. But even now, some experts question whether these products will survive in the marketplace. First-generation HTS wires are made by taking ceramic grains composed of bismuth, strontium, calcium, copper, and oxygen (BSCCO) and cladding them in an expensive silver sheath. That coating drives up the cost dramatically. By a standard measure based on the price of transmitting a given current over a meter of wire, first-generation HTS wire checks in at an exorbitant $200 per kiloampere-meter (kA/m). Experts expect large-scale manufacturing to drive the cost down to $50 per kA/m.

    The upside for HTS wire is that it works at 77 K. That allows the use of cheap liquid nitrogen to cool the material to its operating temperature, saving companies a bundle of money. By contrast, low-temperature superconductors such as niobium-titanium (NbTi) must be cooled to 4 K, which demands expensive liquid helium. However, the alloy costs only $1 per kA/m to produce, and the price is expected to drop even further.

    MgB2 splits the difference on both counts. Although the superconductor operates at temperatures up to 39 K, it would likely be chilled to between 20 and 30 K, because superconductors tend to work better below their peak temperatures. But even 20 K is good news for many applications, because it allows the use of a cryocooler, a type of no-hassle refrigerator that can be plugged into the wall. “There's a big difference [in cooling costs] between 4 K and 20 K,” says Mike Tomsic, president of Hyper Tech Research, a Columbus, Ohio-based company attempting to commercialize MgB2.

    Moreover, magnesium and boron, MgB2's starting materials, are dirt cheap. And instead of requiring a silver cladding, MgB2 works just fine with a sheath of cheap metal, such as iron. At a meeting of the Materials Research Society in December 2001, superconductivity expert Paul Grant of the Electric Power Research Institute in San Francisco estimated that MgB2 wire will eventually match NbTi's cost of $1 per kA/m, making it cheaper than copper wire. After factoring in all the costs—not just the wire but also the cooling equipment and electrical losses—Grant found that an MgB2 transformer would soundly defeat not only an HTS version but also a standard copper wire system today (see figure, p. 787).

    Cheap shot.

    Transformers made with MgB2 wire should cost less than those based on copper or high-temperature superconductors.


    Transformers are just the beginning. “I think it's good enough right now for motors and generators in the 20-to-30-K range,” Grant says. Predictions that the annual market for HTS-based wire and machinery will approach $2 billion by 2025 suggest that there's plenty of room for sales of MgB2.

    MgB2 is well on its way to fulfilling this potential. For starters, groups in the United States and Italy recently reported making MgB2 wires as much as 60 meters long using techniques very similar to those used to make BSCCO wires. By contrast, after 6 years of work on second-generation HTS wire technology—made from a mix of yttrium, barium, copper, and oxygen—researchers have managed to make wires only about 11 meters long (see sidebar).

    Researchers can thank basic physics for this rapid progress, says Giovanni Grasso, a MgB2 and BSCCO wiremaker at the National Institute for the Physics of Matter in Genoa, Italy. Grasso points out that paired electrons travel through different superconductors at different distances from one another, a separation known as the coherence length. In MgB2, electrons have a coherence length of 5 nanometers. That's large, particularly compared to the unit cell of an MgB2 crystal, which measures just 0.3 nanometer across. The result is that in MgB2, electron pairs stretch across more than 10 unit cells of the material. If there is a defect in the material or a jagged boundary between two grains, the electron pairs are effectively so big that, like an SUV encountering a pebble, they simply don't notice it.
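
The back-of-the-envelope arithmetic behind that claim is simple; the two lengths are the ones quoted above.

```python
# How many unit cells does an electron pair span in MgB2?
# Figures from the text: 5 nm coherence length, 0.3 nm unit cell.
coherence_length_nm = 5.0
unit_cell_nm = 0.3

cells_spanned = coherence_length_nm / unit_cell_nm
print(round(cells_spanned))  # roughly 17 unit cells
```

A pair spreads over roughly 17 unit cells, so a single-cell-scale defect is indeed a pebble under an SUV.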

    High-temperature superconductors, by contrast, suffer from the opposite problem. In BSCCO, for example, the unit cell is over 10 times larger than the coherence length, says Grasso. For electrons to pass unencumbered through a BSCCO wire, successive grains must be tightly packed with their crystalline axes closely aligned to prevent electron pairs from getting stuck at every pothole they encounter. “This is very difficult to achieve,” Grasso says.

    The challenge for MgB2 wiremakers is to improve its ability to carry current. Many groups are working overtime to find answers. Last May, for example, a team at Lucent Technologies' Bell Laboratories in Murray Hill, New Jersey, reported that iron-coated MgB2 wires can carry 35,000 amps per square centimeter of wire (A/cm2), a value still well below the 80,000 to 100,000 A/cm2 needed for many applications. In July, five groups reported even higher current-carrying values for the material, up to 200,000 A/cm2 at 25 K in a magnetic field of 1 tesla. The finding suggests that MgB2 wires might soon meet real-world demands.

    Another challenge is that MgB2's current-carrying capacity appears to wither when the magnetic field gets much stronger. But early results hold out hope that tinkering with the material will let researchers maintain carrying capacity. In the 31 May 2001 issue of Nature, two groups reported that they had added defects to their MgB2 wires to help trap magnetic eddies called vortices, which can course through the material and sap its ability to carry current. So far, eliminating vortices altogether has proven impossible. But researchers know that if they can pin the vortices down and keep them from moving, the wire will still superconduct. To do so, one team, led by physicist Chang-Beom Eom of the University of Wisconsin, Madison, substituted oxygen for some of the boron; the other, led by David Caplin of Imperial College London, bombarded its film with protons.

    Iowa's Canfield and others predict more results in the coming months but not necessarily more announcements. “This is where you get to all the patentable stuff,” says Philip Sargent, president of Diboride Conductors, a U.K.-based start-up intent on commercializing MgB2. Sargent insists he's heard of unpublished work describing improvements. But he declines to specify just what those are.

    Whether these rumored improvements will be enough to vault MgB2 over its commercial hurdles is anyone's guess. But not everyone is convinced that the newest superconductor will make it, particularly folks at established HTS companies, such as American Superconductor Corp. (ASC), that are making HTS wire. “It's enough to monitor [their progress] extremely closely,” says ASC's chief technical officer Alex Malozemoff. But so far he hasn't seen anything to make him want his company to switch gears. “An army of researchers has pounced on the material and doped it with everything under the sun. But so far the [superconducting temperature] has only gone down,” he says.

    Perhaps that's because Hermes is just up to his old tricks. Whether MgB2 makes it to market may depend on how many more impediments the capricious deity throws in the path of superconductivity researchers.


    YBCO Confronts Life in the Slow Lane

    1. Robert F. Service

    While MgB2 researchers are already gearing up efforts to make kilometers of wire, researchers working with the top-of-the-line high-temperature superconductor (HTS) are struggling for every meter. “We aren't making the progress we had hoped for in 1995,” says Paul Grant, a superconductivity expert at the Electric Power Research Institute in San Francisco, California. Adds physicist Steve Foltyn of Los Alamos National Laboratory in New Mexico, “There is some sense of impatience.”

    In 1995, Los Alamos researchers reported that a brittle ceramic superconductor composed of yttrium, barium, copper, and oxygen (YBCO) deposited atop a thin metal tape could carry a phenomenal 1 million amperes of electrical current per square centimeter of cross section (A/cm2) (Science, 5 May 1995, p. 644). What's more, it worked at a balmy 77 kelvin and in strong magnetic fields, conditions that no other superconductor, including MgB2, has even approached. Dreams of widespread use of superconductors seemed within reach.

    The trouble was that the early YBCO tapes were only about 4 centimeters long, not the kilometers needed for applications. Six years later, the Los Alamos team and most others still can't make tapes longer than a meter. Last year researchers at Fujikura, a leading superconductivity research company outside Tokyo, Japan, reported an 11-meter tape. “That's impressive,” says Balu Balachandran, a physicist at Argonne National Laboratory in Illinois. But he and others acknowledge that even these tapes aren't close to being ready for practical uses. “You can't make a motor, transmission line, or transformer out of a few meters of tape,” Foltyn says.

    Power coil.

    Los Alamos physicists Steve Foltyn and Paul Arendt show off a meter-long strip of YBCO tape.


    The problem lies in the physics of YBCO. Pairs of electrons surfing through the superconductor breeze along its crystalline lattice, but YBCO crystals break up into tiny grains when the material is deposited on metal tapes. And electrons have trouble hopping between neighboring grains unless the adjacent crystalline axes are in nearly perfect alignment. Any effort to scale up the technology must preserve this alignment throughout the wire. “It's like trying to make a mile-long single crystal,” Grant says.

    That's not the only problem. To make real-world wires, engineers also have to lay down a conventional conductor atop the YBCO to carry the current should part of the superconductor stop working; otherwise, all the electricity would turn to heat and instantly vaporize the wire. Yet when researchers add conductors alongside YBCO, the superconductor mysteriously loses some of its ability to carry current.

    Researchers also face the challenge of making thicker wires. So far the best YBCO tapes contain only a 0.3-micrometer-thick layer of YBCO. Even though they are great conductors, such skimpy threads of superconducting material don't carry much overall current. Yet it's been much harder than expected to bulk up YBCO. “When you make the superconductor thicker, the critical current declines,” says Los Alamos physicist Dave Christen. “That's one of the bottlenecks right now.”

    As if such technical troubles weren't enough, the window for some applications of these second-generation HTS wires may be closing. Transformers and other components of the electric utility grid are aging so rapidly, Grant says, that companies may not be able to wait for researchers to perfect YBCO wires. And because many such components last for decades, “if we can't get the job done fast enough we may miss some opportunities,” Christen says.

    Of course, that may be good news for MgB2's proponents. YBCO's loss could be their gain.

    CANADA

    New Money Widens Gap Among Universities

    1. Wayne Kondro*
    *Wayne Kondro writes from Ottawa.

    New programs are pumping more than a billion dollars into academic research. But a relative handful of universities are getting most of the money

    OTTAWA, CANADA— When Robert Birgeneau decided 18 months ago to leave a deanship at the Massachusetts Institute of Technology (MIT) to become president of the University of Toronto (UT), the chance to move up the academic ladder was only part of the reason. Born and educated in Canada, Birgeneau also was attracted by the opportunity to compete for billions of dollars that the government is shoveling into targeted programs aimed at creating an MIT or two north of the 49th parallel. But although Birgeneau and other top academic administrators praise the new programs as a “crowning achievement” of the current Liberal government, some educators worry that the government is purchasing excellence for a few at the expense of the majority of institutions, faculty, and students. A forthcoming government policy paper on innovation promises to provide a forum for this debate.

    Three new programs have changed the Canadian academic landscape. The $600 million Canada Research Chairs program was created to stem an ostensible brain drain to the United States and Europe (Science, 22 October 1999, p. 651). The Canada Foundation for Innovation (CFI), with an initial endowment of $520 million, is intended to renovate aging buildings, laboratories, and other university facilities (Science, 25 September 1998, p. 1933). And the newly restructured Canadian Institutes of Health Research (CIHR), whose budget has doubled in 3 years to $353 million, hopes to spur biomedical advances that will strengthen the nation's economy and improve public health (Science, 21 December 2001, p. 2452). Funds for the chairs are awarded on a competitive basis using a formula that favors large institutions with a successful track record in attracting grants, whereas those with medical schools have a decided advantage in competing for CFI and CIHR awards.

    “The combination of [these three programs] was a critical factor in my decision to return,” says Birgeneau, who has inherited an institution bursting at the seams as a result of this new federal largesse. The university already has won $53 million for some 120 CFI projects, including a state-of-the-art crystal growth facility. UT also will be able to anoint 270 faculty stars over 5 years, with generous funding for their labs (Science, 23 June 2000, p. 2112).

    But not everybody is happy with the idea of the rich getting richer (see graphic). “The biggest 10 research universities are now getting close to two-thirds of all the money,” says Jim Turk, executive director of the Canadian Association of University Teachers, with UT at the top. That imbalance shatters a cherished Canadian ideal of providing equal access to an excellent education regardless of the nature of the institution, says Turk. Whereas UT will get 270 new chairs, for example, nearby York University, a prominent liberal arts institution, will get a paltry 32.

    Getting richer.

    The same 15 Canadian universities get the lion's share of infrastructure grants and chairs.


    The competitive funding pushes most universities even farther behind in their efforts to keep up with the rising costs of research, Turk says. A recent survey of growth rates among the 113 largest academic libraries in North America, for example, shows that Canadian universities occupy seven of the 11 bottom slots.

    The chairs program has also created what Michael Stevenson, president of Simon Fraser University in Burnaby, British Columbia, calls “invidious distinction and irritation” within faculty ranks. And data suggest that it may not even achieve its desired end of deepening the academic research pool by attracting talent from outside Canada or from industry. Some 370 of 448 accepted chairs have been appointed from within institutions, and 29 involved hiring someone away from another Canadian university. Fewer than 10% of the hires (41) came from abroad, and a scant 2% (eight) made the shift from industry. Stevenson says there is no evidence that the occupants of the new chairs were about to fly the coop, and he frets that the bulk of future appointments will involve more interuniversity poaching. “The net effect will be that we are paying the same people a lot more for no discernible improvement in output,” he predicts.

    Stevenson also believes that people in the social sciences and some natural sciences are disadvantaged by CFI's requirement that another body put up 60% of a project's cost, because it's generally much harder for researchers in those fields to find external sources of funding. Accordingly, the social sciences have garnered only 2% of CFI funding (a mere four projects), although they represent 53% of the country's faculty members, whereas the 18% in the health sciences have won 45% of the pot. This imbalance has exacerbated resentment among social scientists, says Humanities and Social Sciences Federation of Canada president Patricia Clements: “Of course, there is a division on the campus. While the Minister of Justice was visiting the clean room in the new engineering building, people in the humanities center are phoning the janitor for the fifth time to fix the dripping tap.”

    Growth industry.

    Postdoc Shuichi Wakimoto and the floating zone crystal growth furnace, one of several new University of Toronto facilities with funding from CFI.


    Birgeneau rejects that gloomy assessment of the programs' impact. Even if tiering occurs, he says, peer review ensures that the money is well spent. In addition, he argues that a rising tide lifts all academic boats: “The fact is that, because of these increased resources, we're attracting better graduate students and providing them with a better graduate education.” The result, Birgeneau says, is that even “second-tier institutions” can choose from a wider selection of well-trained faculty members.

    But Frederick Lowy, head of Concordia University in Montreal, feels that the gap has grown large enough and that the government now needs to strike “a better balance between capacity building and rewarding of existing research strength.” Otherwise, he warns, “those universities without research potential will not be able to provide as good an education as those [that] do.”

    He and others had hoped to address such concerns during forthcoming national consultations on Ottawa's long-overdue white paper on innovation. But they were delayed after the chief sponsor, Industry Minister Brian Tobin, unexpectedly packed his bags last month. Tobin's successor, Allan Rock, is expected to pick up the project this spring.


    Beautiful Mind's Math Guru Makes Truth = Beauty

    1. Dana Mackenzie*
    1. Dana Mackenzie is a writer in Santa Cruz, California.

    As mathematics consultant to the hit film about a troubled genius, Dave Bayer learned to balance a whole new set of equations

    Early in the film A Beautiful Mind, Russell Crowe, playing the brilliant young mathematician John Forbes Nash, strides into a classroom at the Massachusetts Institute of Technology to teach his first undergraduate class in vector calculus. The 1950s-era students are wearing coats and ties; Nash, who hasn't even bothered to don a shirt over his undershirt, makes no effort to hide his resentment of them and of his teaching duties. Hurling the assigned textbook into a wastebasket, he writes a series of equations on the blackboard and announces that the rest of the course will be devoted solely to solving the problem they represent—a task, he says, that will take some of them “all your natural lives.”

    It's a pivotal moment. The student who later rises to the challenge—unsuccessfully—is Nash's future wife Alicia (played by Jennifer Connelly), who will nurse him through 3 decades of mental illness and share his triumph when he receives the Nobel Prize. Like many key scenes in the film, though, the one that launches their journey together mixes fact and invention. Though Alicia Larde did take John Nash's advanced calculus course, he never threw out such a challenge.

    So when director Ron Howard needed a mathematical problem for the scene, he couldn't just pluck one out of someone's 50-year-old class notes. Instead he asked Dave Bayer to make it up—to invent the math that Nash would have written if such a scene had actually taken place. Some mathematicians or math historians might have balked, but for Bayer—an algebraic geometer at Barnard College in New York City who was moonlighting as A Beautiful Mind's mathematical consultant—it was all in a day's work. “For me, movies are dream sequences,” Bayer says. “But even the wildest dream sequences are anchored in reality.”

    Bayer's task was to forge the anchors. It is a job, Bayer says, that Howard and his team took very seriously. “They have found that real life is much more surprising than anything people can make up,” he says. “Audiences can tell when the mathematics is real, and they want it to be real.”


    A Beautiful Mind depicts a man immersed in mathematics.


    If a scientific consultant does his or her job well enough, the viewers won't even notice it. And indeed, the ambitious portrayal of mathematics in A Beautiful Mind has done nothing to prevent audiences from appreciating its compelling love story. It has been one of the five top-grossing movies in the United States every week since its release, and on 20 January it won four Golden Globe awards, including best drama.

    Bayer came to the picture by a circuitous route. In 2000, he had written a review of the Broadway play Proof for Notices of the American Mathematical Society. As a movie and theater aficionado, he wanted to draw his colleagues' attention to a play that treated mathematics seriously. The review found its way to Howard, who liked what he read. The director offered Bayer an interview, which went well enough that Howard hired him on the spot. Once Bayer was on board, Howard outlined his main concerns about the role of mathematics in the film: How could such an intensely internal subject be captured visually? Could mathematics reflect Nash's descent into mental illness and his slow emergence?

    Answering those questions took months of intense effort. “It may look glamorous sitting around a table at the Golden Globes,” Bayer says (he himself watched at home), “but the truth is an amazing amount of work goes into every second of film.” Between February and June 2001, when the movie wrapped, Bayer put in several hundred hours of work on top of his regular teaching schedule. The insane hours are part of the job, he says; he credits his fascination with moviemaking for carrying him through: “If you didn't have that fascination, you'd walk on day three.” His work included writing every one of the countless formulas and computations that cover blackboards and windows throughout the film, apart from a few that Crowe wrote on-camera. Bayer also consulted on set design and props.

    The classroom scene was the pièce de résistance, Bayer says. He approached it as if he were the actor playing Nash, by putting himself into the character's shoes. “This is someone who really doesn't want to teach the mundane details, who will home in on what's really interesting,” he says. Such a mathematician, Bayer says, would have posed a problem that led into a field of active research, perhaps something that he was thinking about himself—a question that let him “feel the gravitational pull of deep ideas.” Yet at the same time, the problem had to be accessible enough so that Connelly's character, a bright physics student, might concoct a plausible, although incorrect, solution.

    The problem Bayer finally chose (see photo) was a more complicated version of a classical physics problem: determining whether a static electric field (the F in lines one and two) necessarily has a potential function (indicated by g). If the “electric field” is allowed to be infinite or simply nonexistent at certain points (collectively indicated by X), the question becomes physically unrealistic but mathematically very rich. The answer depends not only on the geometry of the set X, but also on one's assumptions about the field F, as the fictional Nash explains to Alicia rather brusquely when she offers her stab at a solution.
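    The classical question underlying Bayer's problem can be sketched in a few lines. The display below is an illustration of the mathematics described above, not a reproduction of the film's actual blackboard:

    ```latex
    % Classical case: on all of \mathbb{R}^3, a smooth curl-free field
    % always has a potential function:
    \nabla \times F = 0 \;\Longrightarrow\; F = \nabla g
      \quad \text{for some } g.

    % Deleting a set X from the domain can break this. The standard
    % planar example:
    F(x,y) = \frac{(-y,\; x)}{x^2 + y^2}
      \qquad \text{on } \mathbb{R}^2 \setminus \{0\}

    % satisfies \nabla \times F = 0 everywhere it is defined, yet admits
    % no single-valued potential g, since
    \oint_C F \cdot d\ell = 2\pi
    % around any loop C encircling the origin.
    ```

    How many "independent" curl-free fields on the punctured domain fail to be gradients is measured by the first de Rham cohomology of the complement of X, which is the sense in which, as the article notes, the answer depends on both the geometry of X and the assumptions placed on F.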

    Mathematical haiku.

    Dave Bayer's terse calculus problem plays a key role in the film.


    Bayer acknowledges that he was also playing to the cognoscenti. “For years I had heard mathematicians boasting that they could solve the blackboard problem in Good Will Hunting [a 1997 movie about a mathematical prodigy] while it was still on the screen,” Bayer says. “It would thrill me to no end to have someone with a lot of hubris whispering to their neighbor 'I know what the answer is' and then have it be wrong in the same way as Jennifer's.” Ideally, though, Bayer would like even the most savvy viewers not to think about it. “If you put enough effort into making the math credible, at a certain point you win the war,” he says. “They're caught up in the movie and barely have time to recognize it's a problem in de Rham cohomology.”

    More subtly, Bayer also orchestrated mathematics to trace the ups and downs of Nash's struggle with schizophrenia. In the movie, at the time of his breakdown Nash is working on a famous still-unsolved problem called the Riemann Hypothesis, and he continues to work on it as he recovers. Bayer carefully crafted Nash's work so that in the depths of his illness it verges on arithmetical gibberish but later becomes a plausible attack on the problem.

    Bayer's advice even extended to acting. In the scene in which Nash breaks down while delivering a public lecture on the Riemann Hypothesis, Bayer noticed that the extras in the audience looked bored. It was understandable, after an early start and long hours of costuming—but it was wrong. If the onlookers thought Nash might really have a proof of the Riemann Hypothesis, they should have started out on the edge of their seats, only to grow increasingly bewildered as they realized that Nash was babbling. Bayer moved to alert one of Howard's assistants. “Ron, who has eyes in the back of his head, was there like a shot, asking me, 'What is it, Dave?'” Bayer recalls. “Then he gave the extras better directions than I could ever have on how to play the scene.”

    As an unexpected bonus, Bayer wound up on camera himself. Near the end of the movie, he appears as one of Nash's fellow professors who approach him in the Princeton faculty club and lay their pens on his table as tribute. The “pen ceremony” scene is fiction, but it is one of the most moving scenes in the movie and, Bayer says, a beautiful example of the way a good director creates emotional truth. Other “professors” in the scene include members of the film crew. Their sentiment was genuine, Bayer says, the scene their tribute to Crowe for the bravura performance they had just seen him give.

    Bayer himself expected little recognition for his work beyond a mention in the closing credits. Instead, he says, he had braced himself for criticism from his fellow mathematicians. “He realized he might get pilloried,” says Henry Pinkham, a mathematician and dean of graduate studies at Columbia University in New York City, who consulted for the 1997 film The Mirror Has Two Faces, about a lovelorn Columbia University mathematics professor. “He saw the script and the fact that it took serious liberties with Nash's story. He did a fantastic job, given the constraints.” Indeed, although some critics grumble that A Beautiful Mind exaggerates the competitive atmosphere of postwar Princeton and leaves out important parts of Nash's life and work, the mathematics in the film has come through peer review with flying colors.

    The best-informed critic of all seems to be satisfied. John Nash, who has seen A Beautiful Mind several times, wrote to Bayer that he appreciated the “bona fide sophistication” of the math in the movie—although he added that in the film's portrayal of his later work, the fictional Nash seems to know some things that “the real Nash (me)” never did.


    Humiliated Lab Fights to Save Face

    1. Michael Balter

    The U.K.'s Institute for Animal Health was ridiculed over an alleged sample mix-up. But the facts of the case are far more intriguing

    EDINBURGH— Down a quiet corridor here at one of the United Kingdom's premier research labs sits a tall, padlocked freezer containing a handful of samples of brain tissue. The samples are all that's left of an experiment that went badly wrong—and yet no one really knows why. But solving the riddle could have important implications for public health.

    Last October, scientists at the Institute for Animal Health's (IAH's) Edinburgh branch were about to publish results suggesting that British sheep might have become infected with bovine spongiform encephalopathy (BSE), or “mad cow disease,” in the early 1990s. The human form of BSE, variant Creutzfeldt-Jakob disease (vCJD), is an invariably fatal neurodegenerative disease that has already caused more than 100 deaths in the United Kingdom. Finding BSE in sheep could raise the specter of an alarming new reservoir of infection. “The potential implications for public health if BSE were found in sheep would be large,” notes John Krebs, chair of the Food Standards Agency (FSA). But the authors pulled the paper at the last minute after an independent lab concluded that the brain extracts they had tested—which the team believed were from sheep—actually came from cattle (Science, 26 October 2001, p. 771).

    British newspapers derided “government scientists” who supposedly couldn't tell the difference between sheep and cattle—a hard blow for a team known for many key discoveries, including the critical finding in 1997 that vCJD is caused by BSE. IAH's plight worsened the following month when two government-ordered audits identified alleged deficiencies in how the samples had been labeled. And the timing could not have been worse: The U.K.'s Biotechnology and Biological Sciences Research Council (BBSRC), which provides a large chunk of IAH's core funding, had just begun examining the institute's requests for grant renewals. For weeks after the story broke, IAH staff lived in fear of cutbacks and job losses.

    Instead, BBSRC has come out in strong support of the institute, while acknowledging that the lab may well have made some errors. IAH “will continue to receive core support for its excellent research,” BBSRC spokesperson Andrew McLaughlin told Science. And outside researchers insist that the episode has not significantly tarnished IAH's overall reputation. The group “can look back at 4 decades of exceptionally productive research,” says Adriano Aguzzi, a neuropathologist at the University of Zürich, Switzerland. Although the Edinburgh team concedes it may have made mistakes, including possibly mixing up the samples, it believes that there has been an unfair rush to judgment, particularly by the British media. The researchers are challenging the findings of the independent lab—the Laboratory of the Government Chemist (LGC)—and have launched their own investigation into the decade-long chain of events that led to the fiasco. “We are confident in our work and in our standards,” says IAH researcher Moira Bruce. LGC, meanwhile, stands by its findings.

    One experiment too many?

    Much of the reputation of the Edinburgh lab, one of three IAH maintains in the United Kingdom, rests on its development of “strain-typing” techniques to discern variations in the infectious agents that cause BSE and other spongiform encephalopathies, such as scrapie and CJD. Most researchers now believe that these diseases are caused solely by aberrant forms of proteins called prions.

    To identify the infectious agent, researchers feed or inject mice with brain extracts from humans or other animals suffering from prion diseases. The incubation time varies for each strain and, coupled with hallmark patterns of brain damage, enables researchers to tell many strains of scrapie apart; they can even distinguish vCJD from “sporadic” CJD, an extremely rare condition that's not linked to BSE. In a pivotal experiment in 1997, the group showed that BSE and vCJD behaved identically in strain-typing studies, providing the smoking gun that humans were getting sick from eating mad cows.

    Normally, the brain extract used to infect mice comes from a single animal, so the chances of a mix-up are slight. But to probe whether BSE was lurking in sheep a decade ago, the researchers had to resort to a pool of sheep brain material collected under less than ideal conditions—and for an entirely different purpose.

    Team player.

    IAH chief Chris Bostock insists the jury is still out on brain mix-up.


    In the early 1990s, after it became clear that the BSE epidemic had spread when cattle were fed so-called “meat and bone meal” (MBM) from slaughtered cattle and sheep, officials began to hypothesize that the high-temperature MBM rendering process may have transformed scrapie into a form that's infectious to cattle. To investigate how to inactivate these agents, government veterinarians collected 2867 brains from sheep apparently infected with scrapie and pooled the extracts. These early experiments, which were led by IAH microbiologist David Taylor—and which included a similar study of 861 BSE-infected cattle brains—were wrapped up in the mid-1990s after researchers found that some modified rendering techniques did seem to knock out the infectious agent. The leftover samples were stored in an IAH freezer.

    The team may now be wishing that the samples had stayed there. In 1997, however, IAH launched a new round of experiments with the pooled sheep brain samples to further probe whether BSE originated from scrapie. In the meantime, the government had become increasingly worried that BSE might be masquerading in the sheep flock as scrapie. This concern was bolstered by findings that sheep infected experimentally with BSE had scrapielike symptoms. In 2000, the U.K. Department for Environment, Food, and Rural Affairs (DEFRA) asked IAH to extend the studies to try to resolve whether sheep had been infected with BSE in the early 1990s, the height of the mad cow epidemic.

    Molecular biologist Chris Bostock, IAH's director, says that the alarm bells immediately began ringing. “We cautioned against this use of the samples, because of the uncertainty in their provenance,” he says. The team, he says, had long been concerned that the sheep samples might have become tainted with cattle brain extract when collected at veterinary slaughterhouses where both species were killed. If those cattle brains were themselves infected with BSE, the results of experiments could be difficult to interpret. Yet the leftover sheep brain pool offered the only opportunity to learn whether BSE had entered sheep a decade earlier. “There was no choice,” says IAH neuropathologist Janet Fraser. “You either used that or you didn't do the experiment.” IAH agreed to carry out the study, although it asked DEFRA to pay for independent testing for contamination.

    Conflicting data

    At this point, the team, despite its concerns about trace contamination, still believed that the samples it was working on were primarily sheep brain. But two audits commissioned by DEFRA after the affair broke last fall found deficiencies in the way the Edinburgh team had labeled and stored the samples since the early 1990s. One of the audits, carried out by the private firm Risk Solutions, suggested how a mix-up might have occurred when the second round of experiments began in 1997. The leftover sheep and cattle brain pools had been stored in the same freezer, and the technician working with Taylor told the auditors that the sample labels were not entirely clear.

    The Edinburgh team declined to make the technician available for an interview. Taylor—who retired 2 years ago—says that he can now only recall that there was “some form of identification that retrospectively one might consider to be potentially ambiguous.” But the team thought the matter was resolved after IAH's genetics team analyzed prion genes in the samples. When these came back with a DNA signature specific to sheep, the team assumed that it was working on the right samples.

    The rude awakening came last October, when a DEFRA-commissioned study concluded that IAH had been working on cattle brains all along. The devastating report came from LGC, an independent outfit that had been contracted to check one last time into IAH's concerns that the samples might harbor trace contamination. LGC used the highly sensitive polymerase chain reaction (PCR) to analyze DNA in the samples submitted to it by IAH. The news media trumpeted the stunning result: all cow, no sheep.

    Although neither of the government-commissioned audits was able to pinpoint what had gone wrong, the Risk Solutions auditors concluded that a sample mix-up when the experiment first began in 1997 was the most likely explanation for LGC's finding. IAH biochemist Robert Somerville, who took over the experiment after Taylor retired, agrees that this scenario is “plausible,” although he argues that it “does not explain” the in-house genetic test results pointing to sheep.

    Bostock insists that the jury is out on exactly what is in the samples. “It didn't surprise me that there was bovine material,” he says. “What was surprising and damaging was the claim that there was no sheep material.” Bostock and other IAH researchers contend that this conclusion is premature, especially because the samples were highly degraded after being processed for earlier experiments and had been thawed and refrozen repeatedly. They raise the possibility that if highly degraded sheep material had been contaminated with a tiny amount of better-quality cattle extract, PCR would have picked up only the cattle signature.

    They're the ones with the woolly sweaters!

    The British media accused “government scientists” of getting their ruminants wrong.


    But LGC molecular biologist Helen Parkes, who supervised the PCR work, rejects this scenario. “We got a very good yield of DNA,” she says. “The levels we were seeing are not consistent with trace bovine contamination at all.” She confirms that LGC did not detect any sheep DNA. IAH is now conducting its own PCR tests.

    Right all along?

    Some outside researchers familiar with the findings of IAH's ill-fated study—which suggested that sheep had been infected with BSE—posit another scenario: that IAH studied the right brains but submitted the wrong samples to LGC for analysis. The conclusion drawn in the unpublished paper was that strains similar to both scrapie and BSE were present in the brain pool. If IAH's genetic findings were valid, as Bostock asserts, and if contamination levels were low, the BSE-like strains must either have derived from scrapie strains resembling BSE or from BSE-infected sheep, the paper's authors concluded.

    The Risk Solutions auditors considered it unlikely that the wrong samples had been delivered to LGC. But Edinburgh researchers do say that the BSE “signal” they saw in the mice was atypical—and therefore could not have come from testing a pure-cattle sample. In previous experiments in which mice were inoculated with BSE, the animals were easily infected and showed a characteristic BSE pattern. But in this study, the scientists had to reinject the infected mice brains into a second group of mice before the BSE pattern emerged.

    This finding suggests that the researchers may have been working with sheep extracts all along, says Danny Matthews, chief of spongiform encephalopathy research at the U.K.'s Veterinary Laboratories Agency in Weybridge. “My interpretation is that they were actually strain-typing a pool of [sheep] brains and that at some subsequent point the study was compromised,” he says. Taylor agrees: “These results could not have been obtained from infected cow brains. They would have seen different incubation periods and [pathology] profiles.”

    Whatever the explanation, Bostock says that the institute was given no time to try to figure out what had happened before being exposed to public humiliation. He and other IAH scientists still fume at FSA's decision to announce on its Web site last August that the unfinished study was under way, before the possibility of contamination had been eliminated.

    But Krebs, FSA's chair, defends the decision to go public. Early announcements are “tricky,” he says. But if the results had confirmed BSE in sheep, “no one would have doubted that we did the right thing by telling people at a time when the government was promoting the consumption of lamb.”

    Bostock and other researchers argue that the study could not have determined whether lamb is safe to eat today. Only ongoing studies on the current sheep flock can answer that question. “What matters to us in the U.K. is whether BSE is present in the sheep population now,” says James Ironside of the National CJD Surveillance Unit in Edinburgh. But the now-discredited study could have helped fine-tune estimates of how many people might have become infected with BSE in the early 1990s from eating sheep—and thus might eventually contract vCJD. Unless the mystery of the wayward brains is cleared up—for example, by IAH's internal investigation—we may never know the answer.