News this Week

Science  12 Jul 2002:
Vol. 297, Issue 5579, pp. 170

    Agreement Unlocks Loan for TB and AIDS Treatment in Russia

    1. Paul Webster*
    1. Paul Webster is a writer in Moscow.

    MOSCOW—After more than 3 years of wrangling and delay, a $150 million loan from the World Bank designed to tackle Russia's burgeoning AIDS and tuberculosis epidemics might at last be on the verge of approval. It has been held up because Russian officials have refused to accept the TB treatment scheme prescribed by Western agencies. But in the past few weeks, negotiators from the World Bank, the World Health Organization (WHO), and the Russian Ministry of Health have apparently settled their differences. “The discussions have now been completed. We've agreed on the basic issues,” says Tatyana Loginova, health officer for the World Bank in Russia. “All we're waiting for now is the government to approve the loan.”

    Speed and thoroughness are the watchwords of successful TB treatment, but Russia's approach has faltered, say observers such as Vinciane Sizaire, a TB adviser for Doctors Without Borders, which operates a TB treatment program in Siberia. “The approach has been erratic since the early 1990s,” Sizaire complains. As a result, the number of TB cases has soared, reaching epidemic levels in the late 1990s. Government figures suggest that there are 133,000 newly diagnosed cases of TB in Russia every year, with as many as 100,000 of them concentrated in prisons, where almost 30% of patients are infected with drug-resistant strains.

    According to Sizaire, the explosion of drug resistance in Russia is largely the result of poor medical management, scarce drugs, and unfinished courses of treatment. Valentina Shishkina of the Red Cross in Russia, which treats 10,000 patients with a budget of only $500,000, estimates that up to 40% of TB patients get irregular treatment, which allows drug-resistant strains to flourish. “The government's approach until recently has been that they don't know exactly how to deal with this,” she says. Mikhail Perelman, the Russian government's top TB adviser, acknowledges that scarce resources have been a major problem and that TB mortality rates are “unacceptable.”

    Help needed.

    Patients in a Russian TB prison. About 100,000 Russian prisoners have TB, and drug resistance is a growing problem.


    Russia is also struggling with a rising tide of HIV infection and AIDS. Official figures indicate that the number of HIV infections has almost tripled to 200,000 in the past 3 years. But government researchers say many cases are not reported and the real figure is close to 700,000. The World Bank recently warned that the epidemic could reduce economic output in Russia by more than 10% by 2020.

    In the late 1990s, the Russian government requested help from the World Bank in dealing with these twin epidemics, and a loan was agreed on in 1999. But Russian officials, including Perelman, advised the Ministry of Health to reject the loan because it stipulated the use of a TB treatment procedure advocated by WHO known as Directly Observed Therapy Short-Course (DOTS), which requires a very closely monitored antibiotic regimen. The DOTS approach has been adopted by 148 countries and has been especially successful in China and India, where one-third of the world's TB cases reside.

    But DOTS conflicts with Russia's traditional approach to TB treatment, which relies heavily on mandatory inoculations, x-ray diagnosis, and isolation of patients in sanatoriums, sometimes for years. Paul Farmer of Harvard Medical School in Boston, who helped design the medical program for the World Bank, said the Russian objections to updated approaches were “frustrating.” The stalemate over how to tackle TB, moreover, held up funding from the same loan for AIDS therapies.

    There was more to the dispute than simply a culture clash, however. Last year, Perelman said that World Bank stipulations would force Russia to buy drugs from foreign companies, a charge Russian drug company executives had raised in parliamentary hearings. “Russia wanted economic benefits for the drug industry,” despite the fact that Russian drug companies lack the capacity to produce the numerous drugs used in the DOTS treatment, says Sizaire. Olusoji Adeyi, the World Bank's senior health specialist for Russia, says, however, that Russia would be free to use home-produced drugs. “There are no conditions imposed by the World Bank at issue,” he says. “There are no guidelines for drug procurement.”

    Russian officials have finally accepted the DOTS approach. Mikko Vienonen of the WHO Directorate in Russia says that last week's agreement followed a lengthy “process of understanding” that was delayed by numerous administrative hurdles. The Russian Ministry of Health will now recommend that the government pass a special prikaz, or order, to harmonize TB treatment with the DOTS methodology. “We are practically almost there,” says Vienonen, although he cautions that full government approval will be needed before the World Bank money starts flowing and TB and AIDS patients see any benefit. Vienonen has no idea how long that final hurdle will take to overcome: “I've been in Russia for 3 years working on this,” he says. “I've stopped making predictions.”


    Nudge From Congress Prompts NCI Review

    1. Jocelyn Kaiser

    After receiving a complaint from Congress, the National Cancer Institute (NCI) has removed a fact sheet from its Web site discussing abortion and cancer risk, pending a scientific review of the information it contained.

    On 7 June, Representative Chris Smith (R-NJ) and 27 other abortion opponents wrote to Tommy Thompson, secretary of the Department of Health and Human Services (HHS), deploring revisions NCI made to its fact sheet in March. The institute reported that recent studies indicate that having an abortion does not appear to increase a woman's risk of developing breast cancer. NCI director Andrew von Eschenbach ordered it removed on 19 June and asked several NCI divisions to prepare reviews of the science. An HHS spokesperson says that Thompson “never discussed” the letter with von Eschenbach, even though the two had one of their regular meetings a few days after it was received.

    The controversy concerns a murky issue in epidemiology. Several studies before the mid-1990s found an association between induced abortion and breast cancer, says Robert Hoover, director of NCI's epidemiology and biostatistics program. But, he adds, “there was a lot of concern about the methods.” The chief problem was that the studies relied on interviews, and researchers suspected that women with breast cancer might be more likely than healthy women to report having had an abortion. Then in 1997, a large study based on Danish health records of abortions—not self-reports—found no increased cancer risk. In its 1999 fact sheet, NCI concluded that the overall evidence was “inconsistent.”


    An NCI fact sheet minimizing abortion-related breast cancer risk has been pulled for review.

    The institute came to a firmer conclusion in March, however. Citing the Danish study and four newer ones, NCI stated in a revised fact sheet that “the current body of scientific evidence suggests that women who have had either induced or spontaneous abortions have the same [breast cancer] risk as other women.” The lawmakers called this a “glossing over” of the evidence and said that the fact sheet is “scientifically inaccurate and misleading to the public.” Epidemiologist Karin Michels of Harvard Medical School in Boston disagrees: Although the lawmakers' letter raised one valid point about the Danish study—that it might have missed abortion records for some women—she says, an attached analysis contained “many incorrect statements.” She thinks the March fact sheet “was fine.”

    Some antiabortion activists, maintaining that the self-report studies are valid, have mounted a campaign in some states to get legislation passed requiring clinics to inform women about them. “This is a key weapon in the antiabortion arsenal,” says Elizabeth Cavendish, legislative director of the National Abortion and Reproductive Rights Action League in Washington, D.C.

    Hoover acknowledges that “there have been differences of opinion” about how much weight to give the self-report studies. NCI officials note that several studies under way should help resolve the debate.

    Complaints about fact sheets aren't unusual, Hoover says; “once or twice a year,” Congress or consumer groups complain about NCI's positions on thorny topics such as how often women should get mammograms. He adds, “I hope we can get past this [latest request] and move on.” Meanwhile, HHS spokesperson Bill Hall says, “if it's determined that there are no inaccuracies in the [NCI fact sheet], it will go back up the way it is.”


    First Member of Human Family Uncovered

    1. Ann Gibbons

    Paleontologist Michel Brunet has excavated thousands of fossils—elephants, crocodiles, apes, and hominids—from rich beds in Afghanistan, Pakistan, and Chad. But last summer his good luck turned pure gold: A sharp-eyed undergraduate member of the French-Chadian team spotted the skull of a primate on the sandblasted floor of Chad's Djurab Desert. And when Brunet looked at its ancient face, he recognized the find of a lifetime.

    Featured on the cover of this week's issue of Nature, the partial skull is now described as that of the oldest known hominid, a member of the lineage that includes humans but not other apes. It is dated to 6 million to 7 million years ago and so fills a crucial gap at the dawn of human evolution, a period about which next to nothing is known. The next oldest published hominid skull is almost 3 million years younger.

    Paleoanthropologists are stunned by the new skull's antiquity and surprising mix of apelike traits and hominid features. “It is a monumental discovery,” says paleoanthropologist Daniel Lieberman of Harvard University. “It is unquestionably one of the great paleontological discoveries of the past 100 years.”

    Not only are the skull's features surprising, but it was discovered in an unexpected place: the ancient shore of Lake Chad in western Africa. Most other early hominid fossils have been uncovered in eastern Africa, notes Brunet of the University of Poitiers, France. The new skull “is a major opening window for understanding the origins of hominids,” says Tim White, a paleoanthropologist at the University of California, Berkeley, who has seen casts.

    Surprising skull.

    The first hominid was found in Chad.


    The six Chad fossils, which include a nearly complete cranium, two lower jaw fragments, and three isolated teeth, show a unique combination of features, prompting Brunet and his international team to assign them to a new genus and species, Sahelanthropus tchadensis. Nicknamed Toumaï, which means “hope of life” in the Goran language of Chad, it was found in beds where shifting sand dunes have exposed thousands of fossils of crocodiles, elephants, and other creatures. Brunet's team compared the mix of extinct species from the site to fauna at other reliably dated sites in Africa and concluded that the cranium is 6 million to 7 million years old, a date that is “apparently on solid ground,” says White. That age pushes the limits of many molecular studies dating the split between the hominid and chimpanzee lineages to 5 million to 7 million years ago (Science, 15 February, p. 1217).

    In keeping with its age, the skull looks most like that of an ancient ape, with a brain the size of a chimpanzee's, large incisors, and widely spaced eyes like a gorilla's. But the shape and size of its canines and lower face resemble those of human ancestors that came later. When Lieberman first saw a cast of the skull, for example, he was stunned by the face. Chimpanzees and other apes have protruding lower faces. But this cranium shows “a short, vertical face” with less of a snout, says Lieberman. “These cranial features astonish me, because they are not present in many early australopithecines. It is in some respects more hominidlike than Australopithecus afarensis”—the species of the famed “Lucy” skeleton, dated to 3.4 million years ago and long considered an early human ancestor.

    The teeth, too, are hominidlike. Although the team thinks Toumaï was probably a male because of its thick brow ridge, the canines are relatively small and unsharpened, unlike those of other great apes; many consider this trait one of the first to define hominids (along with bipedalism). This change might reflect a shift in social structure toward less competition for females, because big, sharp canines are found in males who compete furiously for mates, says Carol Ward of the University of Missouri, Columbia. Also like hominids, Toumaï has no space between its canines and incisors. “It is a dental hominid,” says White.

    Desert treasure.

    Chadian and French researchers sift through rich fossil beds in the Djurab Desert.


    The skull's unique mosaic of characters invites speculation about its relationships to the two other contenders for first hominid: Ardipithecus ramidus, which lived in Ethiopia from 4.4 million to as early as 5.8 million years ago, and Orrorin tugenensis, which lived in Kenya about 6 million years ago. Brunet raises the possibility that Toumaï and A. ramidus might be “sister groups.” White goes even further: “An argument could be made that [Toumaï] is an earlier species of the same genus.” Comparisons are more difficult with Orrorin, which is so far known chiefly from teeth and limb bones rather than a skull, but Orrorin's V-shaped canines are more chimplike than those of Toumaï and Ardipithecus.

    Outside researchers do offer a caveat. If the new skull is from a female rather than a male, the canines are “less striking” and more in line with those of living and extinct apes, says Ward. Brigitte Senut of the National Museum of Natural History in Paris, co-discoverer of Orrorin, says, “When we saw the specimen, we thought, ‘This is a female.’” She suggests that Toumaï might be an early gorilla rather than a hominid. This debate could be settled if Brunet's team finds skeletal bones that show that Toumaï was bipedal—and hence a hominid.

    In the meantime, some researchers say, the skull is so different from that of other known apes that it might represent a sliver of great diversity of ancient hominids—most of which have not been found. “This fossil is a huge wake-up call,” says Lieberman. “It reminds us that we're missing large portions of the fossil record.”


    AIDS Researcher Named CDC Chief

    1. Eliot Marshall

    Julie Gerberding, an infectious-disease researcher who rose to public prominence last fall as a spokesperson for the U.S. government's response to the anthrax crisis, has been named director of the Centers for Disease Control and Prevention (CDC) in Atlanta. Gerberding, 46, is a CDC insider. She was recruited to CDC in 1998 from the University of California, San Francisco (UCSF), to run a national program aimed at controlling hospital infections and medical errors and was promoted to her new job from acting deputy director for science.

    Making the announcement on 3 July, Secretary of Health and Human Services Tommy Thompson praised Gerberding for having “the right mix of professional experience and leadership skills” to run the agency as it focuses on new infectious disease threats. The appointment does not require congressional approval. Nevertheless, Senator Edward Kennedy (D-MA), chair of the Senate panel that reviews health policy, chimed in last week that Gerberding is “a strong public health leader” and “a superb choice.”

    Gerberding, CDC's first female director, appeared in televised briefings and in congressional hearings last fall to explain how the anthrax bacterium causes infection and how to guard against it. “She's a great teacher,” says a former UCSF colleague, Paul Volberding, adding that CDC should benefit from her public communication skills. He also notes that during the 1980s and 1990s, Gerberding organized “an incredible consultation service” that worked around the clock to advise health workers and prevent the spread of HIV infection at San Francisco General Hospital.

    Inside choice.

    Julie Gerberding advances from science chief to director of CDC.


    CDC has been through a rocky period in the past 8 months, observers say, and many hope this appointment will boost morale. Some members of Congress criticized the agency for what they saw as an uncoordinated response to the anthrax mail attacks. CDC's operational response, says Tara O'Toole, director of the Johns Hopkins University Center for Civilian Biodefense Strategies in Baltimore, “was a little rusty.” CDC's most recent director, Jeffrey Koplan, resigned on 31 March, leaving the agency without a permanent chief for 3 months.

    Gerberding is “a terrific appointment,” says O'Toole: “She has great scientific credentials, she's experienced in the real world, and she knows the CDC as an insider.” James Curran, a former CDC epidemiologist who is now dean of the Rollins School of Public Health at Emory University in Atlanta, agrees: “She will be an energetic leader for CDC at a time when concerns about bioterrorism and infectious disease are paramount.” But others, such as Barry Bloom, dean of the Harvard School of Public Health in Boston, warn that any insider like Gerberding faces a big challenge. “It is time to reexamine the architecture of the CDC and its relation to the other U.S. public health agencies,” Bloom says, but it will be hard to bring order to the conflicting fiefdoms.


    NSF to Double Number of Math Institutes

    1. Barry Cipra

    American mathematics just multiplied itself by two. On 1 July, the Division of Mathematical Sciences (DMS) at the National Science Foundation (NSF) announced the creation of three new mathematical sciences research institutes, bringing the total number of such NSF-funded institutes to six. DMS's director, Philippe Tondeur, says he has “incredibly high expectations” for the institutes, which he describes as “vessels for start-up activities.”

    The new institutes will bring together mathematicians and scientists to work on problems ranging from algebraic geometry to neuronal modeling. The institutes, chosen in a nationwide competition, are the Mathematical Biosciences Institute (MBI) at the Ohio State University, Columbus; the Statistical and Applied Mathematical Sciences Institute (SAMSI), a consortium led by Duke University in Durham in collaboration with North Carolina State University in Raleigh, the University of North Carolina, Chapel Hill, and the National Institute of Statistical Sciences in Research Triangle Park; and the AIM Research Conference Center (ARCC) at the American Institute of Mathematics in Palo Alto, California. They join the Mathematical Sciences Research Institute at the University of California, Berkeley; the Institute for Mathematics and Its Applications at the University of Minnesota, Minneapolis; and the Institute for Pure and Applied Mathematics at the University of California, Los Angeles. MBI and SAMSI will each receive $10 million from NSF over the next 5 years; ARCC is slated for $5 million.

    Castle on a hill.

    The American Institute of Mathematics' Research Conference Center in Morgan Hill, California, will host focused workshops.


    MBI will kick off with a yearlong program on neuroscience, including neuronal modeling of olfactory, auditory, and sensory-motor systems. “The mathematical sciences proved valuable in completing the genome project,” notes MBI director Avner Friedman. “The promise of the future is even greater.” SAMSI has programs lined up on statistical aspects of environmental modeling and inverse problems. ARCC is to hold workshops on specific problems—the first, scheduled for December, will focus on algebraic geometry—and create a permanent “workshop Web site network” for each.

    “We're at an exciting juncture,” says Tondeur, who is stepping down as director of DMS this month after overseeing a dramatic 70% increase in NSF math funding over the past 3 years (from $106 million in 2000 to $182 million budgeted for 2003). Mathematics institutes are a “very low cost” way of bringing people together for focused research, he says.


    Active Poliovirus Baked From Scratch

    1. Jennifer Couzin

    With mail-order DNA and more than 2 years of painstaking work, researchers for the first time have assembled a virus from its chemical code. The lab-built poliovirus killed mice and was almost indistinguishable from the original. Biologists disagree on how difficult it would be to construct far bulkier viruses such as smallpox to create bioweapons.

    Scientists hail the research, described online this week by Science, as a technical achievement. But in an age when anthrax travels through the mail, few could avoid the paper's obvious implications, both for polio—a disease that once triggered panicky epidemics and is now nearing global eradication—and other viral diseases. “It is a little sobering to see that folks in the chemistry lab can basically create a virus from scratch,” says James LeDuc, director of the Division of Viral and Rickettsial Diseases at the Centers for Disease Control and Prevention in Atlanta. Vincent Racaniello, a virologist at Columbia University in New York City, was more blunt. “Poliovirus,” he says, “will never be gone.”

    A genomic runt at just 7741 bases, poliovirus is composed of a single strand of RNA and ranks among the most thoroughly dissected viruses of all time. Once it infects a cell, the RNA is translated into a single large protein, which is then cleaved to produce a cluster of smaller ones. Those proteins attack critical cells such as neurons in the central nervous system.

    The researchers—Jeronimo Cello, Aniko Paul, and Eckard Wimmer of the State University of New York, Stony Brook—built an almost perfect replica of the virus, reading the recipe available in a public database of the letters that make up the virus's chemical code. Because RNA is chemically unstable, the scientists converted the RNA sequence to DNA, replacing every uracil base with a thymine. Then they ordered short stretches of carefully arranged bases from one of the many companies that churn out such piecemeal DNA. Cello took about a year to layer these fragments together to form the first third of the virus. Once he established that these stretches stayed oriented correctly, he hired a DNA synthesis company to assemble the remaining portion, which it did in 2 months. To distinguish the synthetic virus from natural strains, the group inserted 19 markers, minor mutations that weren't expected to alter polio's behavior.

    According to plan.

    Poliovirus reconstructed from its genetic sequence is indistinguishable from the original, shown here.


    DNA in hand, the researchers immersed it in enzymes to convert it back to the RNA at polio's core. The artificial poliovirus acted much like its natural counterpart: It multiplied, and antibodies could block it from entering cells. Mice injected with the synthesized virus became paralyzed after about a week, as did animals infected with normal poliovirus. But the artificial version was less lethal: Between 1000 and 10,000 times more virus was needed to kill an animal. The team suspects that one or more of the 19 markers are hobbling the virus.

    The research might throw a wrench into polio eradication plans. “It erodes the underpinning of the whole eradication concept,” says Peter Jahrling, a smallpox researcher at the U.S. Army Medical Research Institute of Infectious Diseases in Fort Detrick, Maryland. Last month the World Health Organization (WHO) announced that it had erased the disease from the European continent, and, according to Bruce Aylward, WHO's coordinator of the Global Polio Eradication Initiative in Geneva, “the goal is to stop immunization” once the disease is fully eradicated. But given the possibility of recreating the virus, researchers who favor continued immunization, such as Donald A. Henderson, an adviser to the U.S. government on bioterrorism policies, hope that WHO will reconsider its stance.

    Then there's the question of whether one could reconstruct other pathogens whose sequences are publicly available. Smallpox, among the most feared bioterror agents, is far more massive than polio at 185,000 bases and far more complex. LeDuc, for one, doesn't believe that rebuilding it is imminently doable. But given the new results, others aren't so sure. “In principle, yes, [it's] possible to synthesize smallpox,” says Vadim Agol, a virologist at the Russian Academy of Medical Sciences in Moscow.

    Despite such nightmarish scenarios, scientists have no plan to stop posting new genetic sequences online. Wimmer says that no concerns were raised to him about publishing the paper. As Cello says, “By releasing this you alert the authorities … [about] what bioterrorists could do.”


    Stem Cells Not So Stealthy After All

    1. Gretchen Vogel

    Human embryonic stem (ES) cells get no free pass from the immune system, contrary to some researchers' early hopes. As the cells differentiate, they express increasing levels of the telltale tags the body uses to distinguish between native and foreign cells. The findings, published online this week by the Proceedings of the National Academy of Sciences, confirm that a patient's immune system would be likely to reject transplanted tissues derived from ES cells. Scientists hoping to use the cells to treat Parkinson's disease, diabetes, and other maladies will therefore have to find ways to reconcile the body's defense system with the transplanted cells.

    Earlier evidence from human embryos raised the slim but tantalizing possibility that ES cells might be “immune privileged,” unrecognizable by the body's defenses against foreign cells. One study reported that the embryo cells that give rise to ES cells do not express the so-called MHC proteins that help the immune system identify an invader; another produced inconclusive results. That led some researchers to hope that transplanted tissue derived from ES cells might remain under the radar of the immune system.

    Although the new results are not unexpected, they lay that hope to rest, says Hugh Auchincloss, a transplant surgeon at Harvard Medical School in Boston, who notes that he has frequently heard scientists claim that MHC proteins aren't expressed in ES cells. “As simple as the data are here, it's something that we didn't know before,” he says.

    Out of hiding.

    As they differentiate, human ES cells express increasing levels of immune system proteins.


    Graduate student Micha Drukker, cell biologist Nissim Benvenisty, and their colleagues at the Hebrew University of Jerusalem looked for MHC molecules in three human ES cell lines. Two were derived at the University of Wisconsin, Madison, and the third at Monash University in Melbourne, Australia. The team used a fluorescent-tagged antibody that attaches to MHC molecules and measured the amount of fluorescence that showed up in three stages of ES cell development: undifferentiated ES cells; so-called embryoid bodies that form as ES cells begin to differentiate; and teratomas, which are tumors formed by differentiated ES cells. As a control, they also tested a non-ES human cell line called HeLa.

    The team found very low, but consistent, expression of MHC class I molecules on the undifferentiated ES cells. However, as the cells differentiated, they produced higher levels of the proteins. Although the levels are not as high as in the HeLa cells, they are high enough that they would probably trigger an immune reaction, says J. Andrew Bradley, a transplant surgeon at Cambridge University, U.K.

    Even though ES cells aren't invisible to the immune system, scientists have several potential avenues around the problem of transplant rejection. They could genetically alter ES cells so that MHC proteins would not be expressed, build up a cell bank—similar to a blood bank—of cells with a range of MHC profiles, or use nuclear transfer techniques—better known as cloning—to create genetically matched ES cells for individual patients.

    But each approach has its drawbacks. “It is hard to imagine an ES cell line bank that would have a match for all patients,” Auchincloss says. Genetically altering ES cells to develop a “universal donor” cell that would not express MHC proteins is not only technically difficult but could leave the resulting tissue more susceptible to infections and tumors—two things MHC molecules help the body fight against. And deriving genetically matched stem cell lines for individual patients using nuclear transfer techniques is not only controversial but would likely be too expensive for treating large numbers of patients, says Bradley.

    The news is not all bad, Bradley notes. The relatively low levels of MHC expression might at least mean that tissues derived from ES cells would be less prone to rejection than today's whole-organ transplants.


    Diamond Dust Dearth Raises Doubts

    1. Richard A. Kerr

    Most experts agree that the solar system's most ancient rocks from asteroids and comets should be sprinkled with microscopic diamond dust, a remnant of ancient stars. The less altered the rock since the gas and dust of the solar nebula came together, the more star dust should survive. But a group of researchers reported this week that at least some of the most primitive, unaltered rock in the solar system contains no diamond star dust at all. The finding raises questions about just how star stuff came to form the solar system. “It really was an unexpected result,” says microscopist Lindsay Keller of NASA's Johnson Space Center in Houston, who was not in on the (non)discovery. “Why nanodiamonds are not there is uncertain.”

    During the past few decades, researchers have found interstellar dust grains in some less altered meteorites by doing what one cosmochemist called “burning down the haystack to find the needle”: dissolving a meteorite until only the hardy mineral bits condensed in the atmospheres of stars long ago—silicon carbide, graphite, and diamond—remain. The diamond flecks are so small (3 nanometers in diameter, on average) that a single grain might contain just a couple of thousand carbon atoms. Three years ago, microscopist Zurong Dai of Georgia Institute of Technology in Atlanta and his colleagues decided to extend the diamond hunt to microscopic interplanetary dust particles (IDPs) that flaked off asteroids and comets and now sift down through the stratosphere.

    Sky with diamonds.

    Rather than star dust, nanometer-sized diamonds may be a product of the newborn sun.


    IDPs are too small for the “burn down the haystack” approach, so in an analytical tour de force Dai and his colleagues exposed the nanodiamonds by careful acid dissolution and identified them by measuring their distinctive atom-to-atom distance under high-resolution transmission electron microscopy. As they reported in this week's issue of Nature, they found plenty of nanodiamonds in two famous primitive meteorites—Murchison and Orgueil—as well as in two primitive micrometeorites retrieved from antarctic ice and four large, “cluster-type” IDPs from the stratosphere. But they found none in five smaller IDPs, although they were just as compositionally primitive as the cluster IDPs—and therefore also presumably came from the outer parts of the solar system, where stellar nanodiamonds are most likely to have survived. “We should have found nanodiamonds in every [sample] we looked at, but we didn't,” says cosmochemist John Bradley of Lawrence Livermore National Laboratory in California, a co-author on the Dai paper. “That's puzzling.”

    The absence of nanodiamonds in half the IDPs examined has “no easy explanation,” says Bradley. The simplest answer, the group writes, would be that most nanodiamonds were not formed around ancient stars at all but in the inner parts of the disk-shaped solar nebula as the solar system formed. That could leave detectable numbers of nanodiamonds in IDPs that formed closer in but not in more distant ones. The catch is that, if popular theories about chemical conditions in the early solar system are correct, diamonds shouldn't have been able to form there.

    Alternatively, the asteroids or comets that produced the smaller IDPs might have been altered enough to have lost their nanodiamonds. If such bodies turn out not to be primitive, meteoriticists will lose one of their main sources of information about the formation of the solar system—a sacrifice they would hate to have to make.


    Animal Studies Raise Hopes for Spinal Cord Repair

    1. Ingrid Wickelgren

    Researchers are suddenly confident they can make severed neurons grow, but can they translate laboratory finds into cures?

    In the summer of 1993, doctor-turned-entrepreneur Ron Cohen did something many of his colleagues considered a little nutty. He decided to set up one of the first companies—closely held Acorda Therapeutics in Hawthorne, New York—dedicated to developing treatments for spinal cord injury. At the time, it was not even clear that any such therapy was plausible, let alone ready for commercial development.

    But Cohen was determined. His father had been a neurologist, so Cohen knew how desperate the world is for treatments for central nervous system disorders. Spinal cord injuries leave some 10,000—usually young—people permanently paralyzed every year in the United States alone; the total number of people with such damage is about 250,000.

    Nearly 10 years later, Cohen's investment doesn't look so crazy. In the past few years, scientists have proved that they can regenerate spinal nerves, at least in rodents, enabling the animals to walk more normally and regain some primitive forms of sensation. Nearly a dozen research teams have recently reported in prominent journals ways of promoting such regeneration. And although much attention has focused on using stem cells to repair damaged spinal cords (Science, 3 December 1999, p. 1826), researchers are exploring many other options.

    For example, some researchers have discovered specific growth inhibitors and blocked them, allowing neurons to grow and make new connections. Others have unveiled clever ways of building cellular scaffolds across the injury site that encourage neurons to traverse the damage, or chemicals that spur growth when injected into neurons.

    “The explosion of findings has offered a sense of optimism I never expected to see,” says Barbara Bregman, a spinal cord researcher at Georgetown University in Washington, D.C. Adds Geoffrey Raisman, a neuroscientist at the National Institute for Medical Research (NIMR) in London: “I've been in this field for 30-odd years, and for the first time, I'm beginning to think we're in reach of a treatment.”

    But despite the progress in the lab, researchers caution that, with possibly one or two exceptions, it will be at least a few years before these therapies are tested in human patients. Both the scientific and practical hurdles are immense, especially because many spinal cord therapies might require delicate surgery or the ability to successfully deliver protein molecules into the tortuous, largely unfamiliar terrain of the spinal cord. Developing such treatments also requires both financial resources and expertise that most basic researchers lack.

    With a new facility for large-scale testing in animals and a drug just now entering advanced clinical trials, Cohen's firm is ready to fill this gap. Acorda sees potential profits in treatments for chronic spinal cord injuries, because most patients live 40 to 50 years after they are injured. This not only means that they might take a medication for decades but also that they—or their health insurance companies—might be willing to pay handsomely for a drug that helps offset individual, lifetime medical costs of $400,000 to $2.1 million.

    Bridging the gap.

    Olfactory cells (red) transplanted into the damaged spinal cord of a rat encourage the regrowth of neurons (green).

    SOURCE: Y. LI, P. M. FIELD, G. RAISMAN, SCIENCE 277, 2000 (1997)

    But Acorda remains an exception. Most pharmaceutical companies are not interested in pursuing treatments for an ailment that affects so few compared to, say, cancer and heart disease. So it's still unclear who will develop most of these potential treatments, a problem the U.S. government and nonprofit organizations are trying to solve (see sidebar).

    Razing roadblocks

    Such a quandary exists thanks to rapid scientific advances in understanding the injured spinal cord. In the wake of an accident—say, a car crash, fall, or gunshot wound—inflammatory processes inflict additional damage on the cord, including severing nerve fibers that survived the original trauma. To curtail this spreading damage, in the early 1990s neurologists began treating spinal cord injuries with injections of the anti-inflammatory steroid methylprednisolone. But that drug must be administered within 8 hours of an accident, and it has only a modest effect at best.

    So scientists searched for better therapies, in particular those that can help the many patients with preexisting injuries. They have discovered that inflammatory processes not only directly damage the cord but also inhibit its recovery by blocking nerve regeneration. In particular, neural supporting cells known as astrocytes deposit scar tissue that blocks neuronal outgrowth both chemically and mechanically.

    In the early 1990s, Jerry Silver of Case Western Reserve University in Cleveland and his colleagues identified the main inhibitory component of the scar as glycoproteins called chondroitin sulfate proteoglycans. Silver's team showed that nerve cells growing in a dish stopped and turned away when they encountered a streak of proteoglycans. However, when the researchers added a bacterial enzyme called chondroitinase ABC that chops off the sugary branches of these glycoproteins, the nerve fibers grew where they wouldn't before.

    Not until this year, however, did researchers find out how the enzyme would perform in an animal model of spinal cord injury. In work described in the 11 April issue of Nature, Elizabeth Bradbury of King's College London and her colleagues infused chondroitinase ABC into rats immediately after they had partially clipped the animals' spinal cords with forceps. The researchers found that the damaged nerve fibers regenerated and made functional connections across the injury site—connections that seemed to improve the rats' motor skills.

    The treated rats took longer strides than untreated controls, which walked with a short, choppy gait, and regained finer sensory-motor skills such as the ability to traverse a grid and a narrow beam. However, the animals still failed to detect a piece of tape stuck to their paw—something normal rats would immediately strip off—because the axons governing conscious sensations did not grow far enough to reach their brain targets.

    Although chondroitinase alone is unlikely to get people in wheelchairs to walk, it “might be part of some overall strategy for treatments in humans,” Silver says. Adds Bradbury: “Each time we find something like this that we know we can overcome, it's very exciting.” However, the team has not yet explored whether the enzyme will aid recovery from long-standing spinal cord injuries. Also unknown is whether it might have any undesirable side effects.

    The glycoproteins targeted by chondroitinase are produced in response to injury, but researchers have found that the normal cord also makes compounds that inhibit neuronal growth. One of these is a protein called Nogo, discovered in the 1980s by Martin Schwab's group at the University of Zürich, Switzerland. The researchers found that Nogo is produced by the insulating sheath of myelin that surrounds all spinal nerve fibers and facilitates their ability to conduct neural impulses.

    In healthy animals, Nogo might help cement the proper neuronal connections established during development by inhibiting further growth. But the downside is that Nogo also inhibits neuronal sprouting after injury. Schwab's team is now working to find ways of blocking Nogo's effects. In 1995, the Zürich researchers showed that a Nogo antibody helps spawn connections that enabled rats with damaged spinal cords to walk more gracefully, with better balance and longer strides.

    More recently, in 2000, the Schwab team—as well as Stephen Strittmatter's group at Yale and Frank Walsh and his colleagues at GlaxoSmithKline—cloned the Nogo gene, including its human version. This has enabled researchers to produce large amounts of the human protein, providing fodder for antibody production. It has also helped them identify the most active parts of Nogo, which are likely to be more precise antibody targets.

    Despite the apparent lack of interest in spinal cord injury by most big pharma companies, Nogo has drawn some attention. A little over a year ago, Novartis licensed Schwab's Nogo antibody technology. The company is motivated, at least in part, by the idea that Nogo might prove useful in other neurological conditions such as multiple sclerosis, Parkinson's disease, and stroke, which affect far more individuals than spinal cord injuries.

    Meanwhile, Strittmatter's group last year identified a receptor through which Nogo exerts its effects. In the 30 May issue of Nature, the Yale team showed that a small peptide portion of Nogo that gums up, rather than activates, the receptor could induce both neuronal regrowth and functional recovery in rats with spinal cord injuries. The results identify the receptor as a possible target for a small-drug Nogo inhibitor. Such an inhibitor would be attractive for pharmaceutical firms because it might be swallowed as a pill rather than needing to be pumped into the spinal cord, as proteins and antibodies must be.

    Indeed, recent data suggest that blocking the Nogo receptor might be even more effective than targeting Nogo itself. On 27 June, Strittmatter's team published evidence on Science online showing that a growth inhibitor called myelin-associated glycoprotein (MAG) binds to the Nogo receptor. In work published online by Neuron on 28 June, Marie Filbin and her colleagues at Hunter College of the City University of New York provide further support for the idea that MAG acts through the Nogo receptor. In addition, Harvard's Zhigang He and his colleagues reported in the 27 June issue of Nature that this receptor is also an attachment site for a third myelin-derived inhibitor, oligodendrocyte myelin glycoprotein. Thus, blocking the Nogo receptor might help thwart all three inhibitory influences.

    Another more futuristic possibility for a small-molecule therapy also comes from Filbin's team. Previous work by that group showed that cyclic AMP, a molecule involved in the cell's internal signaling pathways, fosters nerve cell growth by overcoming growth inhibitors such as Nogo and MAG. The researchers are still working out exactly how this occurs. But in experiments described in the 13 June issue of Neuron, Filbin's team, and one led by Marc Tessier-Lavigne of Stanford University and Allan Basbaum of the University of California, San Francisco, showed that cyclic AMP injected into the cell bodies of certain rat spinal neurons before an injury induces regeneration of the branches of neurons that lead to the brain. Filbin predicts that the injections will also work when given after an injury.

    Building a bridge

    When the damage is severe, blocking inhibitors will probably not be enough, however. Regenerated neurons would still have to cross a difficult barrier: large, fluid-filled pockets that are produced by inflammatory processes in addition to dense, rubbery scar tissue. Work by NIMR's Raisman and his colleagues indicates that neurons can be coaxed across such forbidding territory with a bridge of tissue taken from the nose. Olfactory neurons spontaneously regenerate whenever they are damaged—say, by a cold virus or strong solvent that was inhaled—and grow into the brain to make connections that are necessary for the sense of smell. Raisman discovered that these cells grow by extending their axons across a scaffold of supporting cells unique to the olfactory system.

    In the mid-1990s, Raisman and his colleagues transplanted these so-called olfactory ensheathing cells into rats with spinal cord injuries. They found that the transplants not only coaxed nerve fibers to extend across the lesion, but they also restored the rats' ability to use their paws to reach out and grasp a piece of food. And in as-yet-unpublished work, the group has discovered that the transplants can induce severed neurons to create connections needed for climbing, a very complex movement involving the entire rodent body. What's more, the scaffold works even if it is built 6 months after the injury.

    Now Raisman is implanting human olfactory ensheathing cells into rats to determine whether they have similar properties. He is already sketching plans for small clinical trials, to be conducted in collaboration with neurosurgeons in London.

    Other types of cells, including stem cells, might also form effective scaffolds. Lars Olson of the Karolinska Institute in Stockholm, Sweden, and his colleagues have achieved promising results with bone marrow stromal cells, a kind of stem cell that can turn into cartilage and bone. His team transplanted the cells into rats with spinal cord injuries and showed that the cells coalesced into bundles that extended across the scar, creating an environment that encouraged neuronal growth. If grafted 1 week after the injury, the cells also helped the rats recover crude walking skills. (The results appeared in the 19 February Proceedings of the National Academy of Sciences.)

    Plugging holes

    Despite the focus on regeneration, rebuilding cut nerves might not always be necessary for promoting significant recovery of the injured spinal cord. More than half of human spinal cord injuries are incomplete, severing some but not all of the cord's fibers. In these cases, improving the function of the remaining fibers might be an equally promising route. That's the tack being taken at Acorda. The company's chief of research, Andrew Blight, discovered in the 1980s that the myelin sheaths of the spared fibers often sustain damage that impairs their ability to conduct neural impulses.

    Channel blocker.

    Potassium leaking out of demyelinated neurons decreases their ability to conduct impulses (top). The drug fampridine plugs the exposed potassium channels, thereby restoring electrical conductivity (bottom).


    One of myelin's jobs is to cover channels that would otherwise allow potassium ions to flow out of neurons. This prevents current from leaking from the cells and impeding conduction of the neural impulse. Acorda has developed a drug called fampridine (chemically, 4-aminopyridine, or 4-AP) that helps compensate for gaps in the myelin by directly blocking the potassium channels at those gaps. Blight and his team, then at New York University, showed that giving 4-AP to cats with spinal cord injuries restored the ability of surviving neurons to conduct electrical impulses and stimulate a normal pattern of electrical activity in the cats' muscles. In 1991, they showed that the treatment could improve standing and walking ability as well as bladder and sensory function in pet dogs that had become paralyzed in car accidents or after ruptures of their spinal discs.

    Acorda began trials in humans about 5 years ago and has since treated more than 200 patients. Fampridine improved the patients' sensory and motor functions only modestly. But the drug significantly decreased spasticity, or stiffness and involuntary jerking of limbs, in some patients. It also improved bladder, bowel, and sexual function in treated individuals compared to controls. So far, the only significant side effect is a small risk of seizure, presumably because the compound also increases the excitability of healthy neurons. Large-scale human trials of the drug began in early June.

    In the future, Cohen hopes his company will help develop new medications that provide more than partial benefits. Acorda has now constructed what he says is the biggest animal facility in the world for testing treatments for spinal cord injury. There, researchers can test compounds on several hundred rats at a time and get clear, statistically significant answers about whether to pursue a possible remedy.

    The right combination

    Most researchers in the field believe that no single therapy will be up to the job of treating spinal cord injuries. Instead, they propose that it will take a combination of different remedies to overcome the multiple barriers to neural regeneration that scientists believe exist in the injured spinal cord. “Clearly, we're not looking for one best treatment but a combination of treatments,” Olson says. For example, a clinician might build a cellular bridge across the damaged area, administer protein growth factors to boost a neuron's intrinsic capacity for growth, and deliver enzymes that digest scar tissue. Antibodies or small molecules to neutralize the effects of inhibitory factors such as Nogo might also be added to the mix.

    Bregman and her colleagues at Georgetown University have done some of the most promising studies combining a scaffold, in this case made of fetal spinal cord tissue, with infusions of growth factors. They found that this combination produced more complete neural regeneration than either growth factors or fetal tissue transplants alone. Furthermore, last December in The Journal of Neuroscience, Bregman's team reported that delaying this combination treatment until 2 to 4 weeks after an injury in rats produced even better recovery than administering it immediately. The delayed treatment enabled the rats to walk on treadmills and climb stairs, whereas immediate treatment did not.

    Such combination therapies are likely to reach the clinic later, after each individual treatment has been carefully tested alone. Indeed, given the potential dangers of such approaches, some researchers recommend extreme caution before trying any of the current experimental strategies in humans. They worry that some of the more invasive strategies might end up doing more harm than good. “It's very exciting stuff we're doing,” says W. Dalton Dietrich, scientific director of the Miami Project to Cure Paralysis at the University of Miami, “but we need to spend a couple more years [doing research] before we're ready to push something into people.”

    However, many researchers and patients do not want to wait for the perfect treatment if there is a more immediate possibility of a beneficial one. And many argue that without human tests, one will never really know whether something that improves function in rats will be of any use whatsoever in the clinic. After all, humans and rats are dramatically different—from the way they walk to the size of their spinal cords.

    For his part, Cohen is betting that his new facilities for escorting early research to human trials will attract many of the creative minds in this rapidly ripening scientific field. “It's like Field of Dreams,” Cohen says. “If you build it, they will come.”

    NINDS Delves Into Drug Development

    1. Ingrid Wickelgren

    The National Institute of Neurological Disorders and Stroke (NINDS) has traditionally left development of drugs for treating stroke, spinal cord injuries, and other neurological conditions largely up to the private sector. But NINDS will soon announce a dramatic departure from that policy: a new type of “translational” grant designed to bridge the gap between basic research and clinical trials.

    “This is big stuff compared to the current situation,” says Robert Baughman, NINDS associate director for technology development, who is heading the new effort. Given the explosion of promising findings related to neurological diseases (see main text) and a relative paucity of interest from the private sector, NINDS feels it is time to step in to encourage treatment development, Baughman says.

    Projects funded by the new initiative might include attempts to replicate promising experiments, toxicology studies, the development of test tube assays for large-scale screening of potential therapeutic compounds, or implementation of animal models of human diseases. NINDS also envisions supporting studies to determine the proper timing and dosages for drug administration.

    At least some basic researchers think NINDS is on the right track. “In spinal cord injury, stroke, and head injury in general, there have been major discoveries in the last 10 years that point the way to treatments,” says Oswald Steward, director of the Reeve-Irvine Research Center at the University of California, Irvine. “Somebody has to take that next step. We owe it to people like Christopher Reeve.”

    Some researchers might be concerned, however, that the institute has not set aside a specific amount of money to fund the new program and that the more applied projects will compete with basic science projects for the estimated $250 million NINDS is likely to budget for new research grants in fiscal year 2003. Indeed, because development work is pricey, and there are many good projects in the wings, Baughman predicts that translational research might eventually make up a significant fraction of the total pot.

    Peer-review committees will evaluate the translational project applications. But because many scientists do not look favorably upon applied research, the institute is creating new study section guidelines for such applications that it hopes will provide a peer-review environment in which quality translational research projects can receive competitive scores.

    In a policy more common to the private sector, investigators who are awarded the grants, which will be open to industrial as well as academic scientists, will be expected to meet specified milestones or risk losing their funding. In addition, the U.S. Food and Drug Administration has agreed to work closely with NINDS to help ensure that researchers take the steps necessary for eventual FDA approval.

    NINDS is not alone in its quest to encourage translational research. The U.K.'s International Spinal Research Trust is planning a similar initiative. The project is not yet off the ground, but the plan is to pick at least two candidate therapies and hire a project management team, consisting of people with clinical, scientific, and industrial experience who would plot a critical path for each to the clinic.

    And the Christopher Reeve Paralysis Foundation in Springfield, New Jersey, is announcing new support for scientists who wish to move beyond basic research. Reeve says the foundation is now “open for business” to fund large, multimillion-dollar grant proposals aimed at moving quickly toward, and through, clinical trials. “Why should we not challenge scientists to achieve the [medical] equivalent of the moon shot?” Reeve says. “I believe it will happen, and we have scientists who are ready to go.”


    Data Dilemma: Stow It, Or Kiss It Goodbye

    1. Erik Stokstad

    As storehouses burst with bulky samples, an NRC committee proposes a temporary cure for geology's down-and-dirty case of information overload

    When Woody's Appliance Store in Hutchinson, Kansas, blew up 17 January last year, the 20-meter-high flames immediately got it pegged as a natural gas explosion. Firefighters shut off the city supply, yet the gas and flames still roared. That night, geysers of natural gas and water began to erupt a few kilometers east of downtown. One exploded under a mobile home, killing two people.

    Suspecting that the gas had leaked from underground storage caverns, the Kansas Geological Survey (KGS) went in to figure out how the gas was moving. Within hours, survey scientists had created maps of the local geology from digitized records of thousands of wells drilled over past decades for energy exploration. Once they had fingered a particular layer of rock, the geologists went back to the survey's warehouse and dug out a continuous core of rock drilled some 40 years earlier. With this and other information, they quickly advised the gas company where to drill holes to vent the leaked gas.

    It was a dramatic step into the limelight for a dusty cylinder of rock. To geoscientists, such archived cores—bored out of rock and sediment by hollow drill bits—are standard reference tools for assessing hazards, searching for oil and other resources, and gathering an array of basic geologic information. Yet across the United States, many collections of cores and other samples are threatened by improper storage or simply being sent to the dump. The vast storehouses of data owned by oil and gas companies are especially vulnerable as industry giants wind down their exploration of the United States. Much of the data would be expensive or impossible to replace. “It's just a crime if we don't find a way to capture these data for general public use,” says Marcus Milling, executive director of the American Geological Institute (AGI) in Alexandria, Virginia.

    The problem is a “critical shortage of space for current geoscience collections and data, let alone those gathered in the future,” according to a National Research Council (NRC) report released in April.* While recognizing that not everything is worth saving, the report recommends three new $50 million government-funded centers that would rely on scientific advisory committees to figure out what should be kept. “It's a huge issue,” says paleontologist Chris Maples of Indiana University, Bloomington, who chaired the NRC committee. “I'm just stunned by how much has been lost.”

    Up to the rafters

    It's difficult to gauge the amount of geoscience data scattered among museums, state geological surveys, universities, federal agencies, and industry. The NRC committee estimates a total of 24,100 kilometers of solid rock cores and the rock chips, called cuttings, that come out of other wells. To that, they add 100 million boxes of fossils, as well as 560 million kilometers of paper logs, such as records from seismic experiments. The committee believes a quarter of these data are at risk, enough to fill the U.S. Geological Survey's (USGS's) Core Research Center in Denver, Colorado—one of the largest such facilities in the country—20 times over.

    To the rescue.

    A warehoused rock sample helped geologists solve a mysterious fire.


    Exactly how much is already gone? “We tried really hard to answer that question, and we failed,” says committee member Warren Allmon, director of the Paleontological Research Institution (PRI) in Ithaca, New York. “No one wants to admit that they pitched a collection.” Yet, anecdotal evidence suggests that a considerable amount is missing. “We hear stories all the time of companies hauling cores to the dump,” says KGS director Lee Allison.

    Confirmed losses include the core from the deepest well ever in the United States—9583 meters—drilled by an oil company between 1972 and 1974 in Oklahoma. Thrown out after a merger, the core would cost up to $16 million to replace today. And cores belonging to state geological surveys have been damaged or destroyed by earthquakes in Alaska, a collapsing building in Maine, flooding in Kentucky, and collapsed shelving in North Carolina, the committee found.

    That's a pity, geoscientists say, because old samples often find new uses. For instance, seismic records from Los Angeles, taken for oil exploration, are now used to assess earthquake hazards. And techniques such as cathodoluminescence with scanning electron microscopes can coax old cores into revealing how best to extract remaining oil from reservoirs or how groundwater flows. Industry data are also a boon to academic scientists who can't afford to gather the information themselves. “This kind of material has all sorts of new science left in it,” says AGI's Milling. “That's why we've got to save it.”

    Out of room

    The trouble is finding a place to put it. Almost two-thirds of state geological surveys polled by NRC have 10% or less of their storage space available for new collections. Nearly 25% are already full. “We are bursting at the seams,” says Allison of the KGS, which just received a donation of 15,000 boxes of core from BP Amoco. “There's no money out there to build new facilities to expand.” Instead, Allison is juggling space, converting some labs to storage.

    With limited shelf space, something must get tossed for every new item added. Petroleum geologist Wayne Ahr of Texas A&M University in College Station has accepted large donations of core from oil companies. But the only place he has to put them is in a wooden barracks on a World War II airfield that's already full to the brim. In fact, it's been filled up three or four times, which brings painful choices. “Whatever we had to throw out, it's gone forever,” Ahr says. “It's heartbreaking.”

    Natural history museums are filling up, too. PRI has doubled its holdings in the past 10 years, almost exclusively by adopting collections from universities and other institutions. Now director Allmon says he turns away everything except spectacular specimens and special collections that the museum needs. “If it's a large general collection, it pains me, but we don't have any place to put it,” he says.

    Cold storage.

    Ice samples, used in climate studies, are well kept. But other samples similar to this rock core (bottom) from Hutchinson, Kansas, face an uncertain future.


    So scientists do the best they can. Allmon has a barn of his own filled with overflow from PRI, including samples he took from pits in Florida that were later flooded for a housing development. Consulting geologists fill up self-storage units with file cabinets discarded by mining companies that went bust. Many retired geologists keep samples in their basements and garages, says Susan Landon, an exploration geologist affiliated with Thomasson Partner Associates in Denver, Colorado. “They hold out hope that eventually it will find a home … where it will be useful to the community,” she adds.

    In the past decade, the problem has gotten worse, notes Edith Allison of the U.S. Department of Energy's Office of Fossil Energy. As oil companies—which hold much of the data—merged and moved their exploration overseas, they shed cores and other data gathered in North America. “There are billions of dollars' worth of data in the private sector in danger of being lost,” says AGI's Milling. “They have data from areas where you'll never be able to drill another well.” The same trend hit the mining industry.

    USGS is also feeling the pinch. Since 1995, the survey has given away almost two-thirds of its fossil collections. Staff at its core research facility, which houses more than 300,000 meters of core, has fallen from eight in 1994 to three, and storage space was reduced by 40% in 1995 to cut rent costs.

    There are a few success stories. The NRC panel holds up the National Ice Core Laboratory (NICL) in Lakewood, Colorado, as a model facility. Funded by USGS and the National Science Foundation, the lab has a Web-based catalog, well-documented cores, and a clear policy for removing materials from the collection so that little core is wasted.

    The private sector also boasts examples of good practice. When Shell donated 670,560 meters of core to the Texas Bureau of Economic Geology (BEG) in 1995, it threw in a warehouse and $1.3 million for operating costs. In return, the company received tax write-offs. “It's a good model,” says BEG director Scott Tinker, “but it has to be customized for each company.” Tinker expects to announce another major donation of a facility and 400,000 boxes of core shortly. The NRC panel suggests further incentives to encourage this kind of donation.

    Such measures, however, address just a fraction of the problem. To make a bigger impact, the NRC panel recommends that the government fund three new centers to hold core and other materials, modeled after the NICL and the core repository of the international Ocean Drilling Program. At $35 million to $50 million, each facility would cover 16,000 square meters, about the size of a Wal-Mart Supercenter. The centers would relieve the problem of data loss for 10 to 20 years, Indiana's Maples estimates.

    Mustering support for such a major investment will be difficult. “Storing rock isn't sexy,” Landon says. “It's long-term housekeeping that's always going to have trouble competing with other scientific expenditures.” Yet proponents say such large, unglamorous efforts are the only way to avert every scientist's nightmare: losing irreplaceable samples. “It's a sobering thought, and it's not hard to imagine,” Allmon says. “Even with just benign neglect, all these data could slip away.”

    • *Geoscience Data and Collections: National Resources in Peril.


    Big Facilities Account Is Big Headache for NSF

    1. Jeffrey Mervis

    Legislators are pressuring NSF to explain its procedures to researchers with large projects that have been approved but not funded

    In 1994 the National Science Foundation (NSF) wanted to find a way to keep new and expensive facilities from eating into its regular research budget. So it created a separate account and used it to fund a handful of projects, from a new South Pole station to mountaintop observatories. But less than a decade later, a growing portfolio is forcing NSF to face management challenges that it never imagined—and to defend itself against criticism by Congress, scientists, and its own internal auditor.

    In the past couple of years, big facilities have become a big headache for NSF. One problem is a backlog of projects approved for funding by the National Science Board, NSF's governing body, that haven't made it into the agency's budget. Researchers whose projects have been passed over complain that NSF has kept them in the dark about why they didn't make the cut while others did, and some have convinced members of Congress to do an end run by ordering NSF to fund specific experiments. Last month several influential U.S. senators asked the National Academy of Sciences (NAS) to review how NSF makes those decisions. If that were not enough, NSF's own inspector general (IG) recently issued a report questioning how the agency manages existing projects. NSF hopes to blunt the criticism by naming a well-regarded facilities construction chief to a new office, but so far it has been unable to hire anyone on a permanent basis.

    To NSF officials, most of the headaches could be cured with money. Its approach, using what's known as the Major Research Equipment (MRE) account, worked reasonably well when the number of projects approved by the science board roughly equaled the number that could be funded. But last year, President George W. Bush sent Congress an NSF budget that included no new starts. This year, the foundation's budget request includes $126 million for the MRE account, enough to start two projects and continue building five others. That has left four projects in limbo—approved, but unfunded—and several others close to approval, with backers wondering if they will ever get off the ground. Climate modeler Warren Washington, who chairs the science board, says that NSF needs to “double or triple” the current level of MRE funding to satisfy the community's growing hunger for cutting-edge instruments. “We're all working toward a common end, and that end is an increased budget,” says NSF deputy director Joseph Bordogna.

    Concrete ideas.

    NSF hopes to get money to build new research facilities at (top to bottom) Brookhaven, the South Pole, and the Pacific Ocean floor.


    The excess demand has, however, exposed flaws in the system. The science board doesn't prioritize the projects it approves. Until Congress last year demanded the names of all approved projects (Science, 14 September 2001, p. 1972), NSF had never publicly identified individual projects until they appeared in the agency's budget request. That secrecy bred discontent. The process appears “ad hoc and subjective,” wrote six senators in a letter to NAS president Bruce Alberts last month that also complains about NSF's failure to explain how the system works. The senators, the chairs and ranking members of NSF's spending and oversight committees, asked Alberts to appoint a committee to review NSF's priority setting.

    NSF officials bristle at such criticism. “Every project goes through an extensive review, and we are totally transparent about how this takes place,” says Bordogna. “But we'll certainly listen carefully to what the academy has to say and act accordingly.” NSF and academy officials are negotiating the terms of the study, which could be completed by early next year.

    At the same time, the agency's own watchdog is pointing to holes in how current projects are managed. On 15 May, NSF IG Christine Boesz delivered a report to the Senate panel that sets NSF's budget, warning that the current accounting system does not guard against potentially large cost overruns. “NSF's policies and practices do not yet provide adequate guidance for program managers to oversee and manage the financial aspects of major research equipment and facilities,” Boesz declared.

    The report angered former presidential science adviser John Gibbons, who in a letter to the panel accused Boesz of “harassment” of NSF director Rita Colwell. The panel's chair, Barbara Mikulski (D-MD), and ranking member Kit Bond (R-MO) rushed to Boesz's defense, however, writing Gibbons on 10 June that the IG “has acted professionally and fairly … and has played a crucial role in protecting the interests of the American taxpayer.” In its reply to the IG report, NSF defends its procedures and notes that it expects to have revised guidelines to further tighten up those practices by the fall.

    NSF hopes its new office will also improve the situation. But it failed to deliver on a promise last fall to the House Science Committee to have the top job filled by January. NSF's first choice was James Yeck, project manager for the U.S. contribution to Europe's Large Hadron Collider. An 18-year veteran of large research projects at the Department of Energy and a politically savvy outsider, Yeck could have bestowed instant credibility on the beleaguered program. But in May, for personal reasons, he turned down NSF's offer to move from Illinois's Fermi National Accelerator Laboratory to suburban Virginia.

    Yeck thinks that Boesz's criticism of the agency's accounting practices is “unfair.” But he agrees with her recommendations for improving NSF's cradle-to-grave fiscal management of large projects, including better tracking of a project's total costs and making contingency plans for any overruns.

    In the meantime, some approved projects are moving ahead without NSF's official monetary endorsement. Last year IceCube, a neutrino detector under the South Pole, received $15 million after supporters won over an influential appropriator, Representative David Obey (D-WI). And last month, Representative Felix Grucci (R-NY), who represents Brookhaven National Laboratory, asked House appropriators to earmark $26.6 million in NSF's budget to start building a proposed physics experiment at the lab, Rare Symmetry Violating Processes. “It's ready to go, and we hope to get it funded,” says an aide to Grucci.

    That request, and others like it, suggests that NSF has its hands full trying to satisfy both the scientific hunger for new projects and the political demand for greater oversight.


    Shadowy 'Weak Force' Steps Into the Light

    1. Charles Seife

    After decades of work, the most mysterious of the fundamental forces of nature is poised to come into much sharper focus

    The nuclear weak force is making strong claims on scientists' attention. A member of the quartet of fundamental forces in the universe, the weak force is feebler than the strong force that binds protons to neutrons and shorter range than both the electromagnetic force that ties electrons to atoms and the gravity that keeps stars and galaxies from flying apart. It is also particularly difficult to study. It exerts a subtle pull on matter and ignores common-sense rules that other forces obey. For example, the force behaves differently if you reflect it in a looking glass—behavior unlike anything else in physics.

    Its quirky character makes the weak force irresistible to physicists. For more than 3 decades they have studied how the force interacts with quarks, the fundamental particles that make up most of the ordinary matter in the universe. Now, with that quest nearing its goal, they are gearing up to continue the exploration with a radically different class of particles: neutrinos. The past few weeks alone saw the debut of the MiniBooNE detector, a million-liter tub of mineral oil at Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, and the dedication of the MINOS detector, an enormous set of neutrino-detecting plates in Soudan, Minnesota. Other labs are following hot on their heels, in hopes of understanding the full nature of the weak force. “By experimenting in the neutrino sector,” says Michel Spiro of France's Center for Atomic Energy in Saclay, “we're writing a new chapter in Alice in Wonderland.”

    News of the weak

    The through-the-looking-glass properties of the weak force puzzle and delight the physicists who try to understand it. Unlike the strong, electromagnetic, or gravitational forces, the weak force can change the identity of a subatomic particle, transforming an up quark, say, into a down quark or an electron neutrino into a muon neutrino. The force's parlor tricks first came to light in the early 1930s, when physicists were puzzling over a subatomic process known as beta decay. If you watch a clump of cobalt-60 long enough, a neutron in one of its atoms will spit out an electron and become a proton, turning the cobalt atom into nickel. That transformation, the beta decay of the neutron, seemed to violate one of the most hallowed rules of physics, the conservation of momentum. When physicists compared the “action” of the hurtling electron with the “reaction” of the recoiling proton, a little bit of recoil remained unaccounted for.


    Now under construction, the B-particle detector at CERN's Large Hadron Collider will boost weak-force studies to new energies.


    Austrian physicist Wolfgang Pauli sought to close the gap by suggesting that, along with the electron, the neutron emitted a tiny neutral particle. In 1934, his Italian colleague Enrico Fermi dubbed that particle a “neutrino” and explained beta decay by invoking a new force, the weak force. Today, physicists realize that beta decay is caused by the interaction of a quark with a carrier of the weak force known as a W particle. In cobalt-60, the W turns a neutron's down quark into an up quark, changing the neutron into a proton and emitting an electron and a neutrino (technically, an antineutrino) in the process. Fermi's version “didn't talk about exchange of particles, but it described beta decay beautifully,” says Jim Cronin, a physicist at the University of Chicago.

    Feeble as it is, the weak force affects all known subatomic particles—quarks as well as leptons, such as electrons and neutrinos. Neutrinos, in fact, are affected only by the weak force. That makes them exasperatingly hard to detect and experiment with. As a result, researchers turned to other particles to unravel the nature of the weak force. They soon realized that the force has a unique property—one that makes it responsible for all the matter of the universe.

    The weak, the strange, and swarms of B's

    The property, known as charge-parity (CP) violation, is rooted in symmetry. Physicists long assumed that any experiment performed with matter would give the same result as a corresponding experiment with antimatter. This symmetry is known as charge, or C, symmetry. Similarly, they thought, experiments should be identical even if you swap right and left, up and down, front and back, a property known as parity, or P, symmetry. The strong force, the electromagnetic force, and the gravitational force all obey C and P symmetry. The weak force obeys neither. “With the weak force, when a neutrino comes out, it has a handedness, a spin,” says Cronin. “Nature is showing a preference: Nature is left-handed.”

    That partiality breaks the symmetry of the weak force. In the late 1950s, physicists showed that decaying cobalt atoms prefer to spit out electrons in the up direction rather than down—and that in a mirror-image repeat of the experiment, with the orientations of all the particles reversed, electrons would prefer to go down rather than up. You could tell the difference between the two; the weak force violated P symmetry. Soon thereafter, experiments showed that weak interactions also violated C symmetry. Physicists hoped that if they swapped matter with antimatter and reflected the particles' orientations, the weak force would look the same, but the weak force disdained to obey even this CP symmetry.

    These symmetry violations seem to be responsible for the matter in the universe. If matter and antimatter were exactly the same, then the energy from the big bang would have created an equal amount of matter and antimatter—which would then have collided and annihilated each other, leaving a soup of energy. But the early universe had a tiny, tiny preference for matter over antimatter, thanks to CP violation. The wee excess of matter became the stuff of stars and planets and living creatures.

    In 1964, Cronin and colleagues first spotted CP violation in a particle known as a K0 meson, which is made of a down quark and a strange antiquark. The K0 also has a peculiar habit of changing its identity; it can turn into its antiparticle and vice versa. K0 and anti-K0 particles have the same lifetimes, but K0 particles can zoom through a plate of matter much more easily than their antimatter siblings can.

    For a while, physicists thought they had detected two extra K particles: a quick-decaying one known as Kshort and a longer lived one known as Klong. Unlike K0 and anti-K0 particles, both these putative particles could penetrate matter with equal ease. But it turns out that, thanks to a weird feature of quantum mechanics known as superposition, the Kshort and Klong particles can be considered to be a blend of K0 and anti-K0 particles; conversely, the K0 and anti-K0 particles are mixtures of Kshort and Klong particles. Each way of looking at K particles is equally valid. It's something like describing a direction on a map: Ordinarily we think of, say, northeast as equal parts north and east, but it's equally valid to describe north as equal parts northeast and northwest. Either pair of compass headings will work as a “basis” for specifying any direction in a plane. Likewise, it doesn't matter whether you think of the particles as K0 and anti-K0 or Kshort and Klong.
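
    To make the blending concrete, the standard textbook relation (ignoring the tiny CP-violating admixture that Cronin's experiment later revealed, and up to a sign convention) writes the short- and long-lived states as the sum and difference combinations:

    $$|K_{\mathrm{short}}\rangle \approx \frac{1}{\sqrt{2}}\bigl(|K^0\rangle + |\bar{K}^0\rangle\bigr), \qquad |K_{\mathrm{long}}\rangle \approx \frac{1}{\sqrt{2}}\bigl(|K^0\rangle - |\bar{K}^0\rangle\bigr).$$

    Inverting gives $|K^0\rangle \approx \tfrac{1}{\sqrt{2}}\bigl(|K_{\mathrm{short}}\rangle + |K_{\mathrm{long}}\rangle\bigr)$, so either pair of states serves equally well as a basis, just as either pair of compass headings does.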

    Next step.

    Neutrino detections such as this one from Super-K (bottom) hold the future of weak-force research.


    On a map, northeast is canted at an angle to north; in an analogous manner, the basis for describing the K0 is at an “angle” with respect to the basis for Kshort. The bigger the angle, the more “blended” a Kshort or Klong is. In a similar way, quarks under the influence of the weak force have three “mixing angles” that describe the mismatch between two ways of looking at particles: by their “flavor” basis and their “mass” basis. (Flavor is a property of quarks and leptons that separates each class of particles into different categories.)

    To describe the interplay between the mass and flavor bases, physicists employ an array of numbers known as the Cabibbo-Kobayashi-Maskawa (CKM) matrix. This matrix, which includes the three mixing angles as well as a “phase” that describes the CP-violating nature of weak interactions, provides a concise way of describing how the weak force affects quarks. “You can account for the phenomena with numbers, with certain values in the matrix,” says Cronin. “It's not an explanation, but it's a very good model.”
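
    In standard notation (an illustration, not specific values from the article), the CKM matrix connects the quark states that the weak force acts on to the quark mass states:

    $$\begin{pmatrix} d' \\ s' \\ b' \end{pmatrix} = \begin{pmatrix} V_{ud} & V_{us} & V_{ub} \\ V_{cd} & V_{cs} & V_{cb} \\ V_{td} & V_{ts} & V_{tb} \end{pmatrix} \begin{pmatrix} d \\ s \\ b \end{pmatrix}.$$

    Because the matrix is unitary, its nine entries boil down to just four physical parameters: the three mixing angles plus the single CP-violating phase.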

    For 30 years, scientists have been filling in those values by watching particles decay—particularly K mesons, which, unlike most other particles, violate CP symmetry. By looking at how K mesons decay and react, scientists have measured the asymmetry between matter and antimatter as well as the weak mixing angles with reasonable accuracy.

    Now, however, K mesons are no longer the only game in town. About 4 years ago, scientists started mass-producing their slightly heavier cousins, B mesons, which have bottom quarks instead of strange quarks. Like K0 and anti-K0, the B mesons oscillate between the particle and antiparticle varieties, and they also violate CP symmetry. Because B mesons are heavier than K mesons, they are harder to produce. However, they have a lot more energy and can participate in exotic reactions that are barred to K mesons. Thus, B mesons will give physicists a new way to understand the effect of the weak force on quarks. “In B physics, there are many more decays where you can observe it and measure it,” says Richard Jacobsson, a physicist at CERN, the European laboratory for particle physics near Geneva. “And B physics is so rich, you can get at the same values in many different ways,” reducing the errors dramatically.

    Since 1998, physicists at the Stanford Linear Accelerator Center in California have been using B mesons to make increasingly precise measurements of the particle's ability to violate CP symmetry (Science, 18 December 1998, p. 2169; 23 February 2001, p. 1471). The Belle collaboration based at the KEK accelerator lab in Tsukuba, Japan, recently released its first measurements of CP violation in B mesons. The Tevatron II accelerator at Fermilab will also produce enough B mesons to measure CKM matrix values, as well as some of the properties of weak force-carrying W and Z particles. And where the B factory and Tevatron leave off, the Large Hadron Collider (LHC), the multibillion-euro mega-accelerator under construction at CERN, will continue (see upper table).


    As a result, the role of quarks in studying weak-force physics might be nearing its triumphant final bow. By the time LHC comes online in 2007 or so, the CKM matrix should be almost complete. “The additional correction on things you can't measure will probably be small,” says Jacobsson. Barring the discovery of supersymmetry or some other non-Standard Model physics, he says, “the interest in B physics will definitely go down.” Instead, many weak-force physicists say, the future of their field lies with the particle that started it all: the neutrino.

    Massless no more

    Neutrinos feel the pull of the weak force just as quarks do. For decades, however, physicists considered them a lot less interesting, unable to perform the funky oscillation tricks that come naturally to quark-based particles such as K and B mesons. The reason was that theorists assumed neutrinos had no mass. No mass meant that the neutrinos' mass basis was irrelevant; hence, the weird mismatch of the flavor basis and mass basis couldn't be responsible for any bizarre oscillation behavior.

    The first blow to that view came in the late 1990s, when the Super-Kamiokande (Super-K) observatory in Kamioka, Japan, spotted hints that some muon neutrinos changed into tau neutrinos as the particles passed through Earth (Science, 12 June 1998, p. 1689). This oscillation—just as in the quark sector—meant that the neutrino's flavor basis must be different from its mass basis, something that can be true only if the mass of the neutrino is not zero. Thus, physicists concluded that neutrinos must have mass.
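
    The logic can be sketched with the simplest two-flavor approximation (a standard textbook formula, not a result quoted here): the probability that a muon neutrino of energy $E$ has turned into a tau neutrino after traveling a distance $L$ is

    $$P(\nu_\mu \to \nu_\tau) = \sin^2(2\theta)\,\sin^2\!\left(1.27\,\frac{\Delta m^2\,[\mathrm{eV}^2]\;L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right),$$

    where $\theta$ is the mixing angle between the flavor and mass bases and $\Delta m^2$ is the difference of the squared masses. The oscillation vanishes if either $\theta = 0$ or $\Delta m^2 = 0$, which is why the Super-K signal implies that at least one neutrino mass is nonzero.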

    The discovery of the oscillations cleared up a long-standing discrepancy between theoretical and measured values of the numbers of electron neutrinos streaming from the sun (Science, 26 April, p. 632). It also implied that neutrinos and their lepton kin almost certainly have a mixing matrix analogous to the CKM matrix for quarks. “Barring non-Standard Model flavor-changing interactions between neutrinos and matter, there are neutrino masses and nonzero mixing,” says Boris Kayser, a physicist at Fermilab. “This means that the leptons, including the neutrinos, are very much like the quark side.”

    But the similarities go only so far. Most theorists had assumed that neutrinos would interact with the weak force in roughly the same way quarks do. In that case, their matrix—known as the Maki-Nakagawa-Sakata (MNS) matrix—would be almost identical to the CKM matrix. Among other things, neutrinos' mixing angles, the mismatch between neutrinos' mass and flavor bases, should be small, like those of quarks.

    Wrong. Over the past year or so, Super-K, Canada's Sudbury Neutrino Observatory (SNO), and other neutrino experiments have shown that the neutrino mixing angles are large. Large mixing angles mean big differences between the two ways of representing neutrinos. As a result, whereas a quark looks almost the same in its mass basis as in its flavor basis, an electron neutrino (the flavor-basis depiction of the most common of the three types of neutrinos) is an almost equal mix of two mass-basis neutrinos, with a little of the third thrown in for good measure. “That is an interesting contrast to what goes on in the quark sector,” says Kayser, who hopes that the discrepancy might help theoreticians understand how the weak force and its peculiar mixing properties came to be. “This distinction between quark and lepton mixing probably is a clue to the origin of mixing, but we don't know how to read that clue yet.”
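
    In the same notation, each flavor state is a superposition of the mass states $\nu_1$, $\nu_2$, $\nu_3$ weighted by entries of the MNS matrix $U$; for the electron neutrino, for example,

    $$|\nu_e\rangle = U_{e1}|\nu_1\rangle + U_{e2}|\nu_2\rangle + U_{e3}|\nu_3\rangle,$$

    with $|U_{e1}|^2$ and $|U_{e2}|^2$ of comparable size and $|U_{e3}|^2$ small (an illustrative sketch, not measured values from the article). In the quark sector, by contrast, the dominant mixing angle, the Cabibbo angle of about 13 degrees, leaves the CKM matrix close to the identity, so each quark flavor state is overwhelmingly a single mass state.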

    The first step is to measure the elements of the MNS matrix. Apart from knowing that the mixing angles are large, physicists don't really have a grasp of what the matrix will look like, says Kevin Lesko, a neutrino physicist at Lawrence Berkeley National Laboratory in Berkeley, California: “You want to try to do experiments now that can begin to define the individual components better.”

    Already, experiments such as Super-K and SNO detect neutrinos from the sun and from cosmic ray interactions with the atmosphere. KamLAND, a Japanese experiment gathering data with an old neutrino detector in Kamioka, is measuring the oscillations of neutrinos produced by nuclear reactors at different distances from the detector. These and other experiments—including some in which beams of neutrinos are fired at detectors at various distances—will chip away at the mixing angles and will fill in some values of the MNS matrix (see lower table).


    The work could also bear more exotic fruit. One tantalizing possibility is that neutrinos, like K and B mesons, will violate CP symmetry. If so, the implications could be cosmic in scale. According to Kayser, the tiny asymmetry between quarks and antiquarks can't account for the preponderance of matter in our universe: “It does not work. It's way too small.” Many theorists believe that an extra asymmetry between leptons and antileptons would make up the difference. “The stage is set for exploration of CP violation in the lepton sector,” says Kayser.

    It's also possible that neutrinos, unlike quarks, will turn out to be their own antiparticles—that they are “Majorana” rather than “Dirac,” in physics-speak. Such a discovery would reveal important details about the nature of their CP violation. Physicists have been looking in vain for a sign that neutrinos are Majorana. In particular, they have been searching for a certain “forbidden” nuclear decay, a reaction that releases two electrons and no momentum-carrying neutrinos, which is allowed only if neutrinos are Majorana.

    Physicists eagerly await the hard data that will help them tackle such questions. At present, however, they are just beginning to unravel the secrets of neutrinos, measure the elements of the MNS matrix, and understand how the weak force affects a whole family of fundamental particles. “This, right now, is wide open,” says Lesko. “I don't think we have a good understanding right now what it's going to look like. It's kind of fun.”
