News this Week

Science  09 Nov 2001:
Vol. 294, Issue 5545, pp. 1254

    U.S. Enlists Researchers as Fight Widens Against Bioterrorism

    1. Eliot Marshall*
    1. With reporting by Jon Cohen, Martin Enserink, Joshua Gewolb, David Malakoff, and Richard Stone.

    The threat of bioterrorism—which became real this fall—has prompted a flurry of reactions from the U.S. government in a rapidly expanding war against an unknown attacker. Last week, as the number of new anthrax cases in the United States showed signs of tailing off, some of those responses reached deep into the scientific community.

    In an intense effort to track down the source of the attacks, the FBI stepped up a probe of research labs across the country, and in an attempt to improve responses to any future attacks, the Bush Administration recalled two prominent researchers to Washington to help orchestrate government biodefense programs. Congress also began discussing a spate of bills that could have significant implications for scientists. And on the international front, the U.S. government adopted a new negotiating position on a biowarfare treaty.

    Investigations and security. FBI agents have been working for the past month in a probe of U.S. research labs. The objective: to learn whether anthrax spores used in recent attacks could have come from the United States and whether other research organisms might be used as weapons. Some 250 laboratories are registered to handle dangerous organisms, and dozens have been contacted by the FBI. “We're pressing hard to determine” which labs have handled anthrax, Jim Reynolds, a top antiterrorism official at the Justice Department, told a Senate hearing this week. Among those contacted are Louisiana State University in Baton Rouge, the Southwest Foundation for Biomedical Research in San Antonio, Texas, and Brookhaven National Laboratory in Upton, New York. These inquiries, including subpoenas from a Florida grand jury, request lists of employees, descriptions of the strains at the facility, and other details (see next page).

    Tough questions.

    Academic researchers have received government subpoenas seeking information on their anthrax stocks.

    Although the FBI has cast a wide net, its approach appears to have been uneven. Some researchers who work with live anthrax bacteria said this week that they had not been contacted, whereas others who use only small pieces of anthrax DNA (which are not infectious) said that they had been called. In the meantime, many universities are taking inventories of their stock of dangerous organisms. This task is daunting because microbiologists are scientific pack rats: Researchers “collect strains like schoolboys collect sports cards,” says Louisiana anthrax expert Martin Hugh-Jones.

    That fact has energized Congress, which is moving quickly to pass legislation that would require researchers to beef up lab security and register all collections of potential bioweapons with the Centers for Disease Control and Prevention (CDC) in Atlanta. But science and university groups have worked to try to fine-tune proposals, such as one introduced last week by Senator Dianne Feinstein (D-CA) that would also require the registration of all lab equipment that might be used to manufacture bioweapons. “They don't understand what they are asking,” says one academic lobbyist. “That would mean registering virtually every piece of glassware.” A comprehensive bioterrorism bill to be introduced this week by Senator Edward Kennedy (D-MA) is expected to narrow that requirement.

    To reduce the risk that another deadly organism might get into hostile hands, the two facilities designated as official repositories of the smallpox virus have already increased their security. Neither one—CDC and VEKTOR in Koltsovo, Russia—offered details. But some measures were obvious: CDC's sprawling campus, for example, is now surrounded by concrete barriers to thwart truck attacks.

    Front lines.

    Army vaccine expert Philip Russell (above) is called into service as postal workers rally over job safety.


    According to one researcher who asked not to be identified, the CDC's smallpox stocks have been secure for a long time. “There are now several more layers to prevent someone from getting to them,” the researcher says. “An outsider would get lost in the labyrinth.” The director general of VEKTOR, Lev Sandakhchiev, adds that his staff has made “major physical security improvements” in the past 2 years. But the “horrendous” recent terrorist attacks were “a wake-up call for everybody in the world,” he says. “We checked through our security system again.”

    New czars. To help coordinate the response to bioterrorism, Health and Human Services (HHS) Secretary Tommy Thompson recruited two decorated warriors back to government service this week. HHS announced on 1 November that Donald A. Henderson, former director of the World Health Organization's program to eradicate smallpox, will head HHS's new Office of Public Health Preparedness. In addition, retired Maj. Gen. Philip Russell, who once ran the Walter Reed Army Institute of Research, will join HHS as a special adviser on vaccine development and production.

    Henderson, who has previously worked both as deputy assistant secretary of HHS and as associate director of the White House's Office of Science and Technology Policy, is leaving his job as director of the Johns Hopkins Center for Civilian Biodefense Studies in Baltimore, Maryland. “If you asked who is a giant in the field of bioterrorism as a scientist, who has incredible credibility in the community, there's D. A. Henderson, and then there's no number two,” says John Bartlett, chief of infectious diseases at Johns Hopkins School of Medicine. “He's a tour de force in the fields of both science and public health.” Whereas Henderson will orchestrate the many branches of HHS that deal with bioterrorism, insiders say that Russell will have a more defined main task: to speed the development of a new anthrax vaccine.

    The failure of U.S. producers to maintain a viable stockpile of anthrax vaccine for civilians has been an acute embarrassment for the government. The contractor hired by the Department of Defense to produce a vaccine for the Pentagon has been closed down for repairs since 1998 (Science, 19 October, p. 498). Two prominent groups have now urged Congress to resolve the impasse by authorizing a new, government-owned, contractor-operated facility dedicated to the manufacture of critical vaccines. A panel chaired by retiring Virginia Governor James Gilmore told Congress last week that “direct government ownership or sponsorship” of a vaccine lab is “the only reasonable answer.” And the governing council of the Institute of Medicine, chaired by IOM president Kenneth Shine, concluded on 5 November that a “National Vaccine Authority” is “long overdue.”

    Industry officials have been in close talks with HHS about these ideas for the past 2 weeks, says Jeffrey Trewhitt, spokesperson for Pharmaceutical Research and Manufacturers of America in Washington, D.C. The company executives think they can fulfill the government's needs without a vaccine agency, Trewhitt says: “Let's set the goals and see what they can do; they believe they can meet the goals” faster than a federal agency can. Carl Feldbaum, president of the Biotechnology Industry Organization in Washington, D.C., also says he's telling federal officials that U.S. companies can make vaccines faster than the government can. “It's doable,” he says, if the industry can have a long-term financial commitment and protection from antitrust actions and private lawsuits.

    Treaty movement. There's a possibility that the anthrax scare could kick-start stalled talks on measures to beef up compliance with the Biological and Toxin Weapons Convention (BWC). Talks on a BWC protocol broke down last summer, when the U.S. delegation pulled out of negotiations due to concerns that enforcement measures—such as lab inspections—might compromise national security and threaten biotech companies (Science, 20 July, p. 414). Keeping a promise to come up with alternative approaches for a 19 November review conference in Geneva, President George W. Bush last week floated several ideas for strengthening the convention. The most compelling U.S. proposal, experts say, is one to allow nations to extradite for prosecution those who mishandle biotoxins. The idea has been kicking around the United Nations for years, but U.S. support may put it over the top.

    Overall, however, observers are unimpressed with other U.S. proposals, including one to devise a “code of ethical conduct” for bioscientists. “The [Bush] Administration is still in a state of denial,” contends a British bioweapons analyst. One way to mend the rift with other countries, says a U.S. Defense Department official, would be for the government to accept “tempered criticism” at the Geneva conference and then quietly resume negotiations on how to comply with the treaty.


    Peer-Review Critic Gets NIH 'Rejects'

    1. David Malakoff

    Two years ago, Stanford University postdoctoral researcher Michael Vagell asked the National Institutes of Health (NIH) for a grant to study how hormones affect the brain. Like about 70% of all grant applications, it was rejected. Normally, Vagell's fate would have remained a secret, because NIH publicizes only the names of grant winners. Last month, however, NIH complied with a court order and reluctantly handed over a list of unfunded applicants to a longtime critic of its peer-review practices.

    NIH officials say that the public release of the names and addresses violates the privacy of applicants and could hurt the careers of young scientists like Vagell. But the man who won the list, retired entrepreneur George Kurzon of Portsmouth, New Hampshire, says that employers, tenure committees, and colleagues “already know” who isn't winning grants. Left in the dark, however, says Kurzon, are those who could make a real difference: foundations, investors, and others who might fund ideas that NIH either rejected or didn't have enough money to fund. He plans to invite rejected researchers to post their ideas on a free electronic bulletin board. That way, he says, scientists can avoid being penalized by a peer-review system that is “inherently flawed, overly cautious, … and unfriendly to real innovation.”

    Some researchers at the center of the storm say they don't see the release of their names as a threat. “Even the most successful researchers don't always get funded, but I can see why someone might feel embarrassed,” says Mark Blumberg, a neuroscientist at the University of Iowa in Iowa City, who, like Vagell, is on the list but was eventually funded by the National Institute of Mental Health (NIMH). Vagell, who says he doesn't mind the attention, finds Kurzon's information-sharing plan “interesting, … [but] I can't imagine that funders are facing a shortage of good applicants.”

    Show and tell.

    George Kurzon wants to publicize ideas that didn't make it at NIH.


    This is the second time that the 71-year-old Kurzon, a Harvard-trained physician who has worked in the pharmaceutical industry and as a venture capitalist, has forced NIH to cough up such a list. In 1980, he won a court order forcing the National Cancer Institute to reveal the names of its unfunded applicants after he learned that it had rejected a proposal from prominent biochemist Albert Szent-Gyorgyi. Kurzon turned the list over to a social scientist studying peer review.

    Kurzon went to federal court again last year, after NIMH rejected a 1999 Freedom of Information Act request for a similar list. In July, a judge found that although neither NIH nor Kurzon had made a strong case, the law requires agencies to make records public whenever possible. So on 12 October NIH sent Kurzon a list of the 800-plus NIMH applicants who weren't funded in the spring of 1999, after informing everyone on the list and inviting them to contact Wendy Baldwin, head of the agency's extramural grants office.

    “I can't see how a list of names is … the most effective way to advance science,” says Baldwin. The agency already encourages researchers who don't get NIH funding to approach private donors, she says.

    Kurzon thinks that NIH officials are missing the point. The exercise will have been worthwhile if it leads to the funding “of even one overlooked gem of an idea,” he says, adding that he plans to ask every NIH institute to provide updated lists of its unfunded applicants. But scientists may cling to their anonymity a bit longer: Kurzon has yet to raise the money to mail out his invitations or set up his Web site.


    Dopamine May Sustain Toxic Protein

    1. Jennifer Couzin*
    1. Jennifer Couzin is a writer in San Francisco.

    The tremors, stiffness, and slurred speech that accompany Parkinson's disease are rooted in the death of neurons that churn out the chemical messenger dopamine. But precisely what kills these brain cells has long stumped researchers. Now a provocative test tube study suggests that a surprising culprit—dopamine itself—may assist in the neurodegeneration that defines the disease. Parkinson's researchers say the findings are intriguing and worthy of follow-up experiments but caution that they must be confirmed in cell cultures and laboratory animals.

    Neurons in parts of the brain stricken by Parkinson's disease are marked by tangled deposits called Lewy bodies. These clumps are made of the folded, or fibrillar, version of a protein called α-synuclein. Neuroscientists initially assumed that fibrillar α-synuclein—as opposed to the unfolded form common in healthy brains—is responsible for neural demise. Recently, however, researchers have pursued a version of α-synuclein that hovers between normal and fibrillar, called protofibrillar, which some consider far more toxic than fibrils.


    Dopamine may keep α-synuclein in toxic protofibrils (top) by preventing it from forming fibrils (bottom).


    On page 1346, Peter Lansbury of Harvard Medical School in Boston and his colleagues describe their search for compounds that either prevent or encourage protofibril accumulation. Lansbury's team used human α-synuclein produced by bacteria to screen 169 compounds. To the researchers' surprise, of the 15 compounds that inhibited the transition from protofibril to fibril—thus, presumably, making protofibrils stick around in a cell longer—14 belonged to a set of neuromodulators called catecholamines, which includes dopamine.

    The results appeared paradoxical; after all, Parkinson's disease is caused by a crippling loss of dopamine. How could dopamine be worsening the disease? “The whole thing led in a very unexpected direction,” says Lansbury.

    A clue came when postdoc Kelly Conway found that adding antioxidants to the test tube mix reversed the inhibitory effect of dopamine and sped back up the transformation of α-synuclein from protofibrils to fibrils. The dopamine that sends neural messages, which people with Parkinson's disease lack, is stored in synaptic vesicles and protected from oxidation. But dopamine is formed in the cytoplasm, and while there it's easily oxidized. Because the oxidized form of dopamine has been implicated in cell death in earlier studies, and because of his team's new work with antioxidants, Lansbury speculates that the balance between dopamine and its oxidized form goes awry in Parkinson's patients. Perhaps, he says, dopamine is not promptly moved to the vesicles and languishes instead in the cytoplasm, where it's oxidized and sustains protofibrils.

    Other researchers are intrigued: “It's a new way of thinking about the basis for Parkinson's disease … one that ought to be pursued,” says Jeffery Kelly, a chemist at the Scripps Research Institute in La Jolla, California. Virginia Lee, a neurobiologist at the University of Pennsylvania in Philadelphia, says the work complements mounting evidence that protofibrils are harmful and that oxidative stress helps stabilize them.

    Because all three forces—α-synuclein, dopamine, and oxidative stress—are present in some form in normal brains, it remains unclear which is more to blame in Parkinson's. Lansbury's group suspects that a buildup of α-synuclein is the first domino that topples the rest, eventually combining with oxidized dopamine to form protofibrils. Although the theory of excess α-synuclein has not been broadly tested, transgenic mice with excess α-synuclein are more likely to develop Parkinson's symptoms.

    Despite the limitations of studying protofibrils in a test tube, scientists are gingerly discussing the study's implications for treating Parkinson's. Many patients currently take a drug called l-dopa, which crosses the blood-brain barrier and is converted to dopamine in the brain. If the Lansbury team is correct, say neurobiologists such as Teresa Hastings of the University of Pittsburgh, giving cells more l-dopa could be counterproductive: Some neurotransmitter may get oxidized, sustaining protofibrils and allowing them to kill cells. Studies thus far have not deemed l-dopa treatment harmful, and it often eases symptoms. Still, researchers have long wondered about l-dopa's long-term effect on brain cells. A clinical trial funded by the National Institutes of Health and based at Columbia University is examining whether l-dopa affects the progression of Parkinson's disease; results are expected in about a year.


    Putting a Lid on Life on Europa

    1. Richard A. Kerr

    No earthling would choose to live on Jupiter's satellite Europa. But its deep ocean of liquid water beneath −170°C ice has made the satellite the second most enticing body in the solar system for astrobiologists after Mars. Now that the idea of an ocean is generally accepted, the debate has shifted to how far below the icy surface it lies. If it's too far, even the most tenacious ocean life could be cut off from its best energy source—the sun—and heavily shielded from the prying eyes of astrobiologists. But this issue of Science contains discouraging news. New information about how comets punch into Europa's surface suggests that the ice is more than 3 to 4 kilometers thick, not an optimistic 1 kilometer. That poses a greater obstacle for life in Europa's ocean as well as for astrobiologists.

    Researchers have debated the issue through thick and thin. For the past decade or two, most assumed that at least the upper 20 to 30 kilometers of the 100 kilometers or so of water overlying Europa's rocky core was frozen solid. Calculations suggested that heating of the ice by the tidal pushing and pulling of Jupiter might allow such a thick layer of ice while keeping the ocean from freezing from top to bottom.

    Some analyses of images returned by the Galileo spacecraft orbiting Jupiter tended to support the thick-ice hypothesis, too. Analyzing images from the mid-1990s, planetary geologist Robert Pappalardo of the University of Colorado, Boulder, and colleagues at Brown University in Providence, Rhode Island, saw “pits, domes, and spots” on the surface that tended to be about 10 kilometers in size and spaced about 20 kilometers apart. The most likely cause of such features, they argue, is deep ice warmed and thus softened by the ocean and set churning slowly toward the surface, like so many puffy clouds on a sunny day. Such convection requires ice at least 20 kilometers thick, the lower part of which would be soft enough to prevent fractures from propagating from the surface to the ocean. Anything living in the ocean would then need 100,000 years to reach the sustaining sunlight or to receive energy-laden chemicals slowly moving down from the surface, where they are produced by Jupiter's intense enveloping radiation.

    Thickness gauge.

    Impact craters on Europa having central peaks, as does Pwyll (diameter 26 kilometers), suggest that the ice is more than 3 to 4 kilometers thick.


    But planetary scientist Richard Greenberg and his group at the University of Arizona in Tucson have an alternative model that's more hospitable to possible life. Looking at more recent Galileo images, they see a whole range of sizes and spacings of features corresponding to pits, domes, and spots; to them, that looks like spotty melting of ice right through to the surface rather than the effects of thick, convecting ice. The ice can melt through, they say, because it is only “a few kilometers” thick, at most. San Andreas-like fractures visible in Galileo images also support the existence of liquid water not far below the surface, says Greenberg: “In our model, the ocean communicates with the surface very easily,” with ocean water—and any indigenous life—rising through some cracks and being squeezed onto the surface and back into the ocean in the course of every Europan day (3.5 Earth days).

    No one is yet ready to say that the Europan ocean can't readily communicate with the surface, but the “ultrathin” ice layer 1 to 2 kilometers thick that is the most optimistic interpretation of “a few kilometers” doesn't square with a new analysis based on Europan impact craters. In this issue (p. 1326), planetary scientists Elizabeth Turtle and Elisabetta Pierazzo, also of the University of Arizona but not part of the Greenberg group, studied the central peaks that form as material flows back into the hole gouged by the impact of a comet or asteroid. Europan ice, they reasoned, would not have formed central-peak craters if an object had totally vaporized or largely melted through the ice. Computer simulations of large impacts into varying thicknesses of ice suggested that the ice needed to be more than 3 to 4 kilometers thick to form central-peak craters on Europa. “That rules out the very thinnest ice thickness proposed,” says Turtle, at least at the places and times Europa's several central-peak craters formed.

    Another, stereoscopic analysis of surface forms is yielding even larger estimates of ice thickness. Planetary scientists Paul Schenk of the Lunar and Planetary Institute in Houston and William McKinnon of Washington University in St. Louis have calculated that ice more than 6 kilometers thick must underlie plateaus they have found to rise 0.5 to 1 kilometer in height and pits 0.5 kilometer deep. That “rules out the ultrathin case,” says McKinnon, echoing Turtle.

    Although the thicker ice thesis seems to be gaining ground, planetary geologist Robert Sullivan of Cornell University acknowledges that the real world is often not as simple as one extreme or the other. Planetary scientist Christopher Chyba of the SETI Institute in Mountain View, California, says he's also leaning in the direction of thick ice, but “we're not going to feel confident until we get there again.”

    That won't happen anytime soon, however. Budget-strapped NASA has yet to commit itself to a mission to send an orbiter to Europa, which could confirm an ocean from orbit by measuring the tidal squeeze on the satellite. But getting an ice thickness “won't be easy,” says one possible participant.


    Science Office Grows, Nonproliferation Stalls

    1. David Malakoff,
    2. Robert Koenig

    It could have been worse. The new science budget for the U.S. Department of Energy (DOE) isn't flat, as the Bush Administration had requested. But Congress has spent much of the 2.5% increase it awarded the department on pet projects, and it squeezed programs in Russia that protect nuclear stockpiles and employ former weapons scientists.

    The $25 billion spending bill—approved on 1 November and expected to be signed shortly by the president—includes $3.2 billion for DOE's Office of Science, which is the largest funder of basic physical science programs at U.S. universities and government laboratories. Although it follows the White House budget blueprint closely in most respects, lawmakers restored $10 million for fusion studies and tacked on nearly 10% for DOE's Biological and Environmental Research (BER) program (see table). Legislators, however, earmarked nearly all of BER's $84 million in new money for equipment and construction at specific universities—typically in the home states of senior members of the House and Senate spending panels. There is $11 million, for instance, for the new Mental Illness and Neuroscience Discovery Institute at the University of New Mexico in Albuquerque; the state is the home of Senator Pete Domenici, a top Republican on the appropriations committee.


    Given earlier fears of budget cuts, “the bite turned out to be nowhere [near as] bad as the bark,” says Scott Sudduth, the Washington, D.C.-based director of government relations for the University of California. Still, researchers “got rather slim pickings if you consider the important role that science plays in national security,” adds Michael Lubell, a lobbyist for the American Physical Society in Washington, D.C.

    Arms control advocates, meanwhile, failed to increase funding for DOE's nuclear nonproliferation programs. The 2002 budget contains $803 million for arms control programs, $29 million more than the president's request but $69 million less than this year. It also lumps into a single $42 million pot the budgets for two programs—the Initiatives for Proliferation Prevention (IPP) and the Nuclear Cities Initiative (NCI)—aimed at keeping weapons scientists from freelancing for U.S. enemies.

    The decision disappointed IPP officials, who had been expecting a substantial increase from last year's budget of $24.5 million, but buoyed NCI backers, who feared Congress would follow the Administration's wishes and practically kill the $27 million program. “Now it's up to us to figure out how to split up this money,” says DOE's Steven Black, who oversees both programs.

    Congress rejected the Administration's plan to prune DOE's $173 million Material Protection Control and Accounting (MPCA) program, which helps Russia safeguard its nuclear materials. But House and Senate negotiators rejected an effort by Representative Chet Edwards (D-TX) to add $131 million to it. Calling the decision “irresponsible and dangerous,” Edwards and others hope to snare some money for MPCA from the $40 billion antiterrorism package that Congress approved last month but has not yet decided how to spend.


    Government Spurns Human Genome Effort

    1. Vladimir Pokrovsky,
    2. Andrey Allakhverdov*
    1. Vladimir Pokrovsky and Andrey Allakhverdov are writers in Moscow. With reporting by Richard Stone.

    MOSCOW—As most nations rush to mine the riches of the human genome, Russia is moving to eviscerate its 12-year-old National Human Genome Program (NHGP). Science has learned that the science ministry plans to strip the NHGP of its special funding status and fold the money into a general pot for basic research. Beyond imposing a 50% cut in direct spending on genome research, the move will affect millions of dollars in other research activities that the genome program helped to manage.

    The NHGP was the brainchild of Alexander Bayev, a molecular biologist at the Engelhardt Institute of Molecular Biology in Moscow who persuaded Mikhail Gorbachev to establish a national genome program in 1988. The fledgling effort received about $20 million a year over the next 2 years, on par with the U.S. program. Funding ebbed, however, after the Soviet Union's dissolution, a decline that accelerated with Bayev's death in 1994. Still, NHGP researchers pioneered sequencing by hybridization. And dozens of Russians are involved in genome-related U.S. bioinformatics projects, says an official at the National Institutes of Health in Bethesda, Maryland. “Despite a shortage of resources, they have made use of their financing very effectively,” says Valery Soyfer, a molecular biologist at George Mason University in Fairfax, Virginia, and an expert on the history of Russian genetics.


    The new cuts, which end a block grant to the NHGP, threaten to derail projects involving approximately 400 researchers. The NHGP's funds will now be part of a “special purpose” program at the science ministry covering 120 basic research areas. However, only a handful of topics—including tumor genomics and genome software development—cover core areas within the genome program. Each topic will be supported by a single project, with total spending on genome-related research not to exceed $180,000 in 2002. The ministry will appoint its own panel to choose meritorious projects.

    “No one formally closed the NHGP or dismissed the council,” says the Engelhardt's Lev Kisselev, NHGP council head. “But we will no longer choose grantees, and we cannot decide whether the funding will go to a worthwhile project or not.” Ministry officials did not respond to requests for comment.

    Scientists rue what they see as the imminent demise of a program that maintained a sense of community for Russian genome researchers, sponsored workshops, and supported projects in key areas such as bioinformatics and population genetics. If the program were to die, says Evgeny Sverdlov of the Institute of Molecular Genetics in Moscow, “the infrastructure will die, and that will be very bad.”


    Insects Rank Low Among Genome Priorities

    1. Elizabeth Pennisi

    ARLINGTON, VIRGINIA—Insects get no respect, at least from the U.S. agencies that support genome sequencing. That was the grim news here last week at the Comparative Insect Genomics Workshop, sponsored by the U.S. Department of Agriculture (USDA), where entomologists interested in genomics argued for deciphering the genomes of several insect groups. Not only are insects the most diverse creatures on Earth, but they also cause more than $26 billion in damage annually to crops and livestock.

    The USDA, however, can't afford to do anything about this. “Animal genomes are on the radar screen, but insect genomes are not,” says entomologist Mary Purcell-Miramontes of the USDA Cooperative State Research, Education and Extension Service. And that seems to be the case throughout the federal government.

    Only two insects, both biomedically important, have gotten the nod. The genome of the fruit fly Drosophila melanogaster, long studied by geneticists, was deciphered in March 2000 (Science, 24 March 2000, p. 2181). This year, work began on the malaria mosquito Anopheles gambiae. But proposals to study the genetic makeup of other insect species have yet to get funding, even though entomologists say such research could lead to new ways to fight pests and protect pollinators.

    For instance, Gene Robinson of the University of Illinois, Urbana-Champaign, has marshaled support among insect researchers to sequence the honeybee genome. Honeybees are critical for the pollination of many crops, he explains, and some of their relatives are useful in biocontrol. The entomologists think the project warrants the estimated $6 million price tag. But no one is biting, yet.

    The same is true for the 530-million-base genome of the silkworm Bombyx mori, which may shed light on pest moths and butterflies. An international consortium was formed 3 months ago, and Kazuei Mita of the National Institute of Agrobiological Sciences in Tsukuba, Japan, has done some preliminary work on the genome, but funding is not yet forthcoming.

    Funding buzz.

    Currently, there's little support for sequencing insect genomes.


    The USDA's internal research arm, the Agricultural Research Service (ARS), budgets some $60 million for agricultural genomes. But about two-thirds of that goes toward protecting genetic diversity important for agriculture. Most of the remaining money goes to genomics research on domestic animals and crop plants, says Leland Ellis, ARS program leader for genomics and bioinformatics: “Right now there is zero for insect genomes.”

    Other federal agencies also come up short. The Department of Energy has decided to focus on organisms involved in energy production, bioremediation, or carbon sequestration, says DOE's Ari Patrinos—and insects don't fit the bill. Likewise, the National Science Foundation, which over the past 4 years has spent $215 million on plant genomics, won't tackle insects, warns NSF's Chris Cullis: “We'll not be able to fund the sequencing of an aphid no matter what damage they are doing [to plants].” The National Human Genome Research Institute (NHGRI) plans to sequence the genome of a sister species of Drosophila. But, says NHGRI director Francis Collins, “unless it applies to human health, NHGRI is not likely to get involved.”

    To improve the funding situation, the entomology community needs to pull together and garner the support of farm commodity groups much the way the National Corn Growers Association worked to get funding for plant sequencing (Science, 23 October 1998, p. 652), says Ellis. Otherwise, says Purcell-Miramontes, “very little is going to happen.”


    Mirage of Big Budget Boost Evaporates

    1. Ben Shouse

    CAMBRIDGE, U.K.—Italian scientists are up in arms over government plans to drastically scale back a promised increase in science funding in 2002. More than 5000 researchers have signed a petition opposing legislation before Parliament that would eliminate all but $200 million of a scheduled $900 million boost. The new budget “will simply ruin the possibilities for Italian scientists,” argues Nobelist Rita Levi-Montalcini, former director of the Institute of Cell Biology in Rome.

    Scientists had expected to receive $8.2 billion in 2002, up 12% over this year's spending. But that promise was made by Giuliano Amato, whose government was replaced after elections last May. The new administration, headed by Silvio Berlusconi, has made science one of the biggest losers in a review of its predecessor's spending plans. The government puts a positive spin on the change, noting that it doesn't shrink current levels. “There will be no cuts for universities and research” next year, says Guido Possa, vice minister at the Ministry of Education, Universities, and Research.


    Italian scientists are unimpressed. In the newspaper La Repubblica, Levi-Montalcini last week accused Berlusconi of “betrayal.” “They don't care,” adds Renato Dulbecco, an Italian-born Nobel laureate at the Salk Institute for Biological Studies in La Jolla, California. The new budget numbers, he says, will have an immediate effect in preventing the country's National Research Council (CNR) from replacing researchers who retire from its staff.

    Italy can ill afford such policies, say scientists. The country's research spending stands at 1% of the gross national product, compared to the European average of 2.2%, according to a petition from the Italian Association of Doctoral Students protesting the 2002 budget. The group warns of a “lost generation” of young talent driven away by poor funding.

    Funding isn't the only issue that has scientists fuming. One member of Parliament, Marcello Pacini, has proposed privatizing the CNR, arguing that the private sector would do a better job of supporting research. Scientists are hoping to knock down such an idea before it finds its way into legislation. Dismantling central planning, insists CNR president Lucio Bianco, would spark a crisis in Italian research.

    Despite their protests, scientists aren't optimistic about their chances. Indeed, many regard the budget retrenchment as a fait accompli, predicting its passage later this month without significant changes. “It is difficult to think of hope,” Dulbecco says.


    Reality TV Puts Group Behavior to the Test

    1. Ben Shouse

    CAMBRIDGE, U.K.—Two British scientists are preparing to take advantage of the popularity of “reality TV” to recreate a notorious psychology experiment in which students played the roles of prisoners and guards. Skeptics, including the researcher who designed the original experiment at Stanford University in 1971, fear that the BBC production could rerun the abuses that brought it to a halt after 6 days. But the researchers say that the show offers an excellent opportunity to answer pressing questions about the psychology of racism, oppression, and terrorism.

    The Stanford experiment, conducted by psychologist Philip Zimbardo, took place in the basement of the psychology building, which had been converted to look like a jail. Immersed in the situation, the 9 prisoners and 9 guards quickly internalized their assigned roles, the guards becoming brutal and the prisoners at first rebellious and then utterly compliant. Even the researchers acted more like wardens than scientists, suspecting that the prisoners were faking anxiety to gain early release and helping the guards thwart a rumored jailbreak. The experiment, planned to run for 14 days, was stopped after a colleague objected to its brutality.

    The study demonstrated the influence of group pressure on individual behavior. Other experiments during the 1970s confirmed the power of social context. In one, subjects stayed in a room that was filling up with smoke because others seemed unconcerned; in another, they obeyed a lab-coated scientist's orders to deliver what they thought was an electric shock to a human subject. The specter of these disturbing experiments has prevented further realistic, large-scale tests of group psychology.

    Too real. Strip searches and delousing helped student “guards” assert power over “prisoners” in a 1971 experiment.


    Then along came reality TV, which puts people in artificial situations for sheer entertainment value. Stephen Reicher of the University of St. Andrews, U.K., and Alex Haslam of the University of Exeter, U.K., accepted an offer to create a show with a stronger experimental basis. “This is a piece of science being filmed,” says Reicher, who with Haslam will select 15 people to be assigned the role of either guard or detainee. The researchers have chosen a setup similar to Zimbardo's but with a less oppressive atmosphere and safeguards such as independent observers and clear boundaries for subjects' behavior. The BBC will televise the results, but the researchers retain control of the experiment's design and presentation.

    Reicher and Haslam say this is a unique chance to test “social identity theory,” which posits that group identity can override individual personality in shaping behavior. Dominic Abrams, a psychologist at the University of Kent in Canterbury, agrees: “It is rare that one gets an opportunity to simulate a powerful situation.” And in the wake of the 11 September attacks, there is an urgent need for such research. “We don't have to be part of a terrorist cell to gain insight into the psychological processes involved with terrorism,” he says.

    Large-scale social psychology studies can cost hundreds of thousands of dollars, Abrams says, and TV companies may be the only source of funding. Haslam says safeguards alone will cost more than $100,000, but he and the BBC declined to disclose the overall budget for the program.

    Although crews have not yet begun filming, Zimbardo and others have expressed concern that entertainment will be the overriding factor in carrying out the experiment. “There is no question in my mind but that the BBC and their consultants are hoping for something dramatic to erupt, to make it riveting for viewers,” Zimbardo says. He says he declined the BBC's offer to participate because of the danger to the research subjects. Excessive precaution could also doom the experiment, says Peter Collett, a retired University of Oxford psychologist who consulted on the reality TV program Big Brother. “If we don't get the phenomenon that Zimbardo observed, then the whole thing is pointless,” he says.

    Reicher and Haslam insist there is a middle ground between cruel and dull. For one, the study will tone down the power imbalance between prisoners and guards through variations in housing, dress, and status, with the hope of exploring questions Zimbardo left open. For example, they will examine whether groups can have positive effects and if the results might also apply to milder social situations, such as relationships between employers and employees.

    The dangers of Zimbardo's experiment and the trivializing influence of reality TV are the “Scylla and Charybdis” of the new project, Haslam says. Psychologists may differ on the potential perils of the study, but they agree on the importance of its goals. Viewers and researchers alike will have to wait until the show premieres next year to see if the partnership of science and television survives these treacherous waters.


    Black Hole Blazes Away Without a Fuel Supply

    1. Mark K. Anderson*
    1. Mark K. Anderson is a writer in Northampton, Massachusetts.

    The massive jets of supermassive black holes—plumes of gas and dust that extend for thousands of light-years from the centers of some galaxies—require considerable reserves of firepower. The most likely source is the giant doughnut-shaped cloud of gas and dust thought to surround such black holes. But scientists have now found that the black hole at the center of a nearby galaxy called M87 somehow maintains its jets without this vast stockpile of fuel. The apparent paradox has theorists baffled.

    “The most directly puzzling thing is the ‘Here we see it, here we don't’ aspect,” says Julian Krolik, an astronomer at Johns Hopkins University in Baltimore. “What is striking here is that active galactic nuclei of both greater and lesser power than M87, which also resemble M87 in many other respects, are wrapped in thick clouds.”

    Until recently, the energy-spouting center of M87—an elliptical galaxy 50 million light-years from Earth in the constellation Virgo—was thought to be a typical active galactic nucleus, powered by a typical supermassive black hole. However, last year astronomer Robert Antonucci of the University of California, Santa Barbara, noticed that the cloud seemed to produce surprisingly faint infrared emissions. But the observations left many questions about the cloud unanswered.

    Black magic. The black hole at the center of galaxy M87 has a brilliant jet but apparently no torus-shaped gas-and-dust cloud to fuel it.


    Then a team led by Eric Perlman, an astronomer at the University of Maryland, Baltimore County, observed M87 with the Gemini North telescope in Hawaii. In the 1 November Astrophysical Journal Letters, Perlman's team reports that longer observations have provided a much clearer picture of the infrared emissions of M87's black hole. Comparing the emissions from the torus-shaped cloud with the energy coming from the jet, Perlman found that M87's torus-to-jet ratio was only about 1/1000 as great as those of other active galactic nuclei such as Centaurus A and Cygnus A.

    Perlman's findings will force theorists to revisit their models to account for black holes without giant dust clouds, Krolik says. “This makes it harder to produce any model,” he says.


    Seed Treaty Signed; U.S., Japan Abstain

    1. Daniel Charles*
    1. Daniel Charles is a science correspondent at National Public Radio.

    Delegates from 116 nations have agreed on a landmark treaty intended to ease exchange of seed collections held in the world's agricultural “gene banks.” The United States and Japan were the sole holdouts, both abstaining from a final vote taken 3 November in Rome.

    The agreement, formally known as the International Treaty on Plant Genetic Resources, mandates the free exchange among plant breeders of seeds from 35 crops, including major cereals such as rice, wheat, and corn (Science, 26 October, p. 772). Other crops, however, including soybeans, tomatoes, and peanuts, are not included in the treaty after nations with extensive collections insisted on maintaining national control. Many nations have adopted laws restricting the export of such “genetic resources” since the international Convention on Biodiversity entered into force in 1993.

    Under the new agreement, any company that uses seeds from public gene banks to breed a new variety must pay royalties into an international fund dedicated to preserving agricultural biodiversity. The requirement applies only to varieties that are unavailable to other researchers because they are covered by patents or treated as trade secrets.

    African nations pressed for a total ban on patenting of plant genes obtained from public gene banks, while the United States rejected any restrictions on patenting that would contradict U.S. law. In the end, in wording that many negotiators admitted was ambiguous, the treaty outlawed patents that would restrict the ability of gene banks to distribute “genetic parts and components” in their original form.

    U.S. negotiators felt that this still might block patents on genes that an inventor had isolated and purified from plant seeds. European delegates, meanwhile, were angered that so many crops remain subject to national restrictions, potentially crippling efforts by nonprofit plant breeders to develop improved varieties.

    Despite such disagreements, the final vote triggered a spontaneous celebration. “There was half a day of wild emotion,” says Pat Mooney of the ETC Group, a Canadian-based advocacy group that has followed the negotiations. “People were hugging each other. The U.S. negotiators were partying, too.”

    The treaty will go into force when 40 nations ratify it. Mooney says that the U.S. abstention will not cripple the effort and that the United States is expected to abide by most of its terms.


    Science Comes First, Panel Tells NASA

    1. Andrew Lawler

    NASA has been told to revamp its current plan for the international space station and put greater emphasis on science. A new report by an independent panel led by former aerospace executive Thomas Young says NASA must come up with realistic costs for building the orbiting lab, pay more attention to its research components, and adopt a new management structure before finishing construction.

    The 20-member panel, which delivered its blunt report on 2 November, was set up in July by the White House and NASA to analyze the station's costs, which have nearly doubled to $30 billion in the past 4 years. The task force unanimously concluded that even NASA's plan for a scaled-down version “is not credible.” But the panel, which could not even estimate the current price tag, offers the agency a reprieve. If NASA can fix the station's problems in the next few years, then the government could consider enlarging the station to accommodate a crew of six. That recommendation marks a political middle ground between White House officials who don't want to spend any more money on the program and agency managers, researchers, and international partners who want a top-of-the-line facility.

    The high-powered team of financiers, engineers, and scientists also went beyond its limited charter to tell NASA to emphasize research and to put biology at the top of the research agenda. “The space station needs to be looked at differently; it's a science mission,” says Young. Some members want the Bush Administration to see the station as the first step toward future human space exploration. “The space station with nothing to follow it is worthless,” says one. Adds panelist Rae Silver, a Columbia University research psychologist: “What's needed is strong leadership and clear vision.”

    Such qualities may be hard to come by. Longtime NASA Administrator Dan Goldin steps down this month, at a time when the White House Office of Management and Budget is openly hostile to additional funding and Congress is skeptical of NASA's ability to deliver on its promises to provide a worthwhile laboratory. The fiscal constraints caused by a lagging economy and the focus on antiterrorism spending will hinder development of a long-term space strategy, Administration officials say.

    Research delayed. NASA's original research budget, adopted in 1994, has been stretched out and shrunk to make room for rising construction costs.


    The current station crisis, simmering for years, came to a boil this spring after the White House ordered NASA to trim costs. The agency cut the planned crew size in half and abandoned a living-quarters module and a rescue vehicle, sparking an outcry from potential users. But even those cuts are not enough to keep the station within the $8.3 billion spending limit between 2002 and the scheduled completion of the core U.S. portion in 2006.

    The report doesn't tot up the bill, but panel members say privately that more than $1 billion extra is needed. The panel recommends that NASA find the money by limiting shuttle flights and other aspects of its human space-flight budget. Halting work on the core station, however, would have “significantly adverse impacts on the science,” Young warns. The primary international partners—Canada, Europe, Japan, and Russia—also fear that a three-person crew would limit their access and capabilities.

    If NASA can demonstrate that it can complete the core program in a credible manner, then the Administration should consider adding the hardware necessary for a six-person crew, the panel concludes. But neither the panel nor NASA would estimate how much that additional hardware would cost. In the meantime, Young's group suggested boosting research dividends by docking two Russian Soyuz vehicles at the station for at least 1 month out of six. The arrangement would allow a six-person crew to conduct more experiments.

    The panel also urges NASA to turn the massive engineering project into a realistic science program. “NASA has not been good at prioritizing its research” for the station, Silver says. “The whole program until now has been controlled by engineers.” The panel calls for NASA to create a science deputy in the space station program and to coordinate better the research and space-flight offices. Planners must also come to grips with the 40% loss of buying power that resulted from diverting into construction some of the $3.8 billion promised in 1993 for research.

    One victim has been the station's centrifuge, initially the responsibility of NASA and now being built by Japan. The 2008 launch date for the large centrifuge, a facility needed to test the effects of microgravity on living organisms, is “unacceptable,” the report declares. “If you are going to do the kind of science that gets published, you need a centrifuge,” says Richard Roberts, panel member and Nobel laureate at New England Biolabs in Beverly, Massachusetts. And biology should be king, panelists add, if NASA hopes to obtain the sort of knowledge needed for continued human exploration of space.

    A clearer idea of what NASA wants to accomplish is critical to a successful science program, say panelists. “You have to know the purpose of the station and view it as a way point toward something else,” says Robert Richardson, vice president for research at Cornell University in Ithaca, New York. “The vision issue was the most heated part” of panel discussions, recalls Silver. Before they can focus on the long term, however, NASA's new leaders must first survive the current national crisis and the resulting tight fiscal constraints.


    Spooky Twins Survive Einsteinian Torture

    1. Charles Seife

    It's a nagging truth that all physicists must face: Relativity and quantum mechanics don't mix, and when they square off, Einstein loses. Now Swiss physicists have brought the two great theories into the arena again. In an experiment that turns commonsense notions of causality on their head, the scientists showed that relativity's tools for dealing with the flow of time are irrelevant in the submicroscopic realm of quantum processes.

    The experiment, conducted at the University of Geneva, explored the properties of pairs of particles whose fates are linked through a mechanism called entanglement. As long as physicists don't examine them, such so-called Einstein-Podolsky-Rosen (EPR) pairs enjoy a wishy-washy existence, not committing themselves to any particular states of properties such as polarization. But jolt one of the particles into choosing—say, by noting its existence with a detector—and the other instantly feels the tweak, even if it's millions of light-years away. If one particle is detected with horizontal polarity, for instance, the other might instantly assume vertical polarity.

    Lab experiments have repeatedly confirmed that this “spooky action at a distance” operates faster than light, although physicists have shown that it doesn't violate relativity because it can't be used to send faster-than-light messages. In the mid-1990s, however, Swiss physicists Antoine Suarez and Valerio Scarani realized that EPR pairs pose a different sort of relativistic problem, because it's not always clear which particle is tweaking which. The reason is that according to Einstein, observers in different reference frames can disagree about the order in which events occur.

    In a classic thought experiment, for instance, physicists like to imagine a person with a 15-meter-long pole running into a 15-meter-long barn at four-fifths the speed of light (see figure). To an observer looking down from the rafters of the barn, the streaking pole seems to be contracted to 9 meters, so it fits entirely within the barn. This means an electronic sensor can (a) shut the front door and then (b) open the back door. But from the pole's point of view, the barn is moving. It shrinks to 9 meters long, while the pole retains its full length of 15 meters. Why doesn't it smash into the barn door? Because the order of events is different from the runner's point of view. The pole carrier clearly sees (b) the back door open before (a) the front door shuts, the opposite of what a stationary observer sees.
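    The factor-of-0.6 contraction in the thought experiment follows directly from the Lorentz factor. As a quick check of the arithmetic (the function here is an illustrative sketch, not from the article):

```python
import math

def contracted_length(proper_length_m: float, v_over_c: float) -> float:
    """Length of a moving object as measured by a stationary observer."""
    gamma = 1.0 / math.sqrt(1.0 - v_over_c ** 2)  # Lorentz factor
    return proper_length_m / gamma

# A 15-meter pole moving at four-fifths the speed of light:
print(contracted_length(15.0, 0.8))  # 9.0 meters, so it fits in the barn
```

    By symmetry, the runner applies the same factor to the barn: in the pole's frame the 15-meter barn likewise shrinks to 9 meters, which is why the two observers must disagree about the order in which the doors operate.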

    What next? Quantum experiment's relativistic quirks resemble those of a pole moving through a barn at near the speed of light. An observer in the barn would see a short pole and both doors closed at once, but the runner carrying the pole sees a foreshortened barn with at least one door always open.


    Likewise, if two scientists are moving with respect to each other when they measure each half of an EPR pair, they might disagree about who measured the particle first. How could the twins be “communicating” if both scientists think that their twin is the sender and the other is the receiver? In such a “before-before” situation, Suarez and Scarani argued, the two particles can't be communicating at all. The spooky action must fall apart.

    The Suarez-Scarani theory suffered a setback last year, when Nicolas Gisin of the University of Geneva and his colleagues put it to an ingenious test (Science, 17 March 2000, p. 1909). They set up an experiment in which a laser spat out entangled pairs of photons. After zipping through fiber-optic cables, each entangled photon struck a beam splitter, which gave the entangled photons a “choice” of paths leading to different particle detectors.

    To bring relativity into play, Gisin used a whirling drum as a stand-in for one of the device's stationary photon detectors. The drum's motion created an Einsteinian before-before situation, in which each detector thought that it had measured the photon before the particle's twin struck the other detector. Contrary to Suarez and Scarani's theory, the particles stayed entangled. The refutation wasn't quite airtight: Skeptics pointed out that the Suarez-Scarani interpretation could still be true if the particles made their “choices” of path before they struck the detector—at the beam splitters, for example.

    The new experiment closes that loophole by showing that the particles still communicate even if they make their choice at the beam splitters. Using nearly the same setup, Gisin's team—with Scarani added—replaced the moving detector with a stationary one and made the stationary beam splitters into moving ones by pumping sound waves through crystals. In a paper submitted to Physical Review Letters, Gisin and his team describe how they set the speeds of the beam splitters to create a before-before situation. As in the earlier experiment, the particles remained entangled. Although each particle hit the beam splitter “first,” and thus thought it was the sender rather than the receiver, the particles were communicating just as well as when the beam splitters were stationary.

    The results, Suarez says, leave the theory he and Scarani proposed no wiggle room. “The notion of time makes sense only in Einstein's world,” he says. “It doesn't make sense in the [quantum world]. It cannot be described in terms of before and after.” And for those who prefer to live in a world of cause and effect, spooky action at a distance just got even spookier.


    Biodefense Hampered by Inadequate Tests

    1. Martin Enserink

    To prepare for the next bioterrorist attack, researchers say they need faster and more accurate ways to detect the spread of deadly microbes.

    The recent spate of anthrax attacks plaguing the United States has not just created panic but also strained public health labs to their limits. Many thousands of tests have been performed literally around the clock to monitor the spread of the disease and prevent further deaths—and yet the public has found the results often confusing and sometimes contradictory.

    Many people in America's new hot zones became worried about flulike symptoms, potentially the first signs of inhalation anthrax, and wanted to get tested—only to find out that they couldn't. Buildings that tested positive—such as a Microsoft office in Reno, Nevada—were later declared safe, or vice versa.

    But scientists say such problems are unavoidable: The arsenal of tests they have to work with falls short, they say, forcing them to strike an uncomfortable balance between speed and accuracy. “It's obvious that we need new diagnostics” to detect biowarfare pathogens, both in the environment and in the human body, says Jim Hughes, director of the National Center for Infectious Diseases in Atlanta, part of the Centers for Disease Control and Prevention (CDC). Diagnostics are needed “not just for bioterrorism candidates but also for the diseases that they're easily confused with,” he adds. Such tests need to combine three features that current diagnostics often don't have: speed, accuracy, and ease of use.

    Now, prodded by the unnerving events of the past 4 weeks, government and industry labs are stepping up efforts to develop tests that can detect the presence of anthrax and tell whether a person is infected. Experts predict a tremendous push to develop many new tools in the coming years. Indeed, companies at the forefront of pathogen detection find themselves among the few that are doing very well since 11 September.

    Tests and trade-offs

    As the four deaths to date from inhalation anthrax illustrate, this bacterium can kill very quickly, and there are no tests to reliably determine early on whether somebody has been exposed to or infected by it. Thousands of people have undergone nasal swabs. But although such sampling provides an idea of how far contamination has spread within a building, it says little about a person's risk. Even if spores are absent from the nose, they may be present in the lungs, waiting to germinate; likewise, finding spores in a nasal swab indicates exposure but not necessarily infection.

    Even during the initial phase of the disease, when flulike symptoms such as fever, malaise, or nausea occur, there's no easy way to locate the germs within the body. They might be present in the lungs or lymph nodes, but to detect them would require invasive and painful procedures. Researchers are working on better ways to spot infection—for instance, by looking for signs in patients' breath—but those are a long way off.

    In the absence of such tests, CDC is assembling a detailed clinical picture of the current cases, says Hughes. The goal is to develop an algorithm that physicians can follow when deciding who to monitor closely. “That's on the fast track here,” Hughes says.

    In contrast to the dearth of diagnostic tools, there's an arsenal of tests to deal with environmental samples, such as suspicious packages and swabs from mail-sorting machines. First on the scene at the Hart Senate Office Building, for instance, were hazardous materials teams wielding hand-held devices smaller than a pack of cigarettes. These devices are a refinement of an instrument first developed in the early 1990s by researchers led by James Burans at the Naval Medical Research Center in Silver Spring, Maryland. The device works on a simple idea: A bit of the suspicious material is suspended in a buffer liquid and allowed to run across a membrane that contains anthrax-specific antibodies, attached to gold particles. If the sample contains the microbes, they'll drag the antibodies with them, creating a visible line somewhere on the membrane within 15 minutes.

    Nosing about. Nasal swabs can show whether a person was exposed to anthrax spores; they do not indicate infection.


    Recently this “hand-held assay,” as it is known in Navy circles, has been transformed into an easy-to-use commercial test strip—much like a home pregnancy test—by Tetracore, a company founded by former Navy researchers. The test, called BioThreat Alert, is now being sold to cities, counties, hazardous materials teams, and security companies in record numbers. (Tetracore CEO William Nelson says he has even been approached by people interested in turning the assay into a home test kit. “They say they can sell 20 million of these tomorrow if we put them in Wal-Mart,” he says—but home testing would be a dangerous idea, he adds.)

    Although easy to use, these rapid tests are not very sensitive; BioThreat Alert, for instance, requires some 10,000 spores to give a positive reading, says Nelson—so the test needs to be followed by a more sensitive one. These devices also are not very specific, says Calvin Chue, an expert in pathogen detection at the Johns Hopkins Center for Civilian Biodefense Studies in Baltimore: The antibodies can cross-react with closely related and harmless microbes present in the environment, such as Bacillus cereus.

    To test environmental swabs from mailrooms and offices, CDC relies on an old standby: a culture test. Environmental samples are placed in a broth, allowing the bacteria, if they are present, to multiply. Once grown in sufficient quantities, the bacterial colonies can be tested in several ways. For instance, anthrax colonies growing on a dish with sheep blood agar usually have a “tenacious consistency,” according to a CDC lab manual, and when prodded, “the growth will stand up like beaten egg white.” With so-called Gram staining, the bacteria often appear as encapsulated, purple rods under a microscope.

    Culture tests are highly sensitive, says Chue, and they rarely yield false positives. But they take 18 to 24 hours to finish. The high sensitivity also poses a new problem: It can detect amounts whose meaning from a public health point of view is unclear. Tests at the U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID) led researchers to conclude that 8000 to 10,000 spores are needed to infect a person. (By comparison, it takes only 10 bacteria to infect someone with the plague.) But evidence from this fall's experience suggests that the number may be lower, and it's hard to say whether any amount is “medically insignificant,” as the CIA wishfully labeled the traces found in its mailroom.

    Nor does anybody know whether small numbers of spores might have been present in some buildings even before contaminated letters entered the mail system. Very small numbers of spores lie dormant in the soil in many places. Researchers say it seems unlikely that they would also hang around in urban offices, but until last month, nobody had ever looked systematically.

    Another highly sensitive test—the so-called polymerase chain reaction (PCR)—can give an answer much more quickly than a culture. In PCR, enzymes seek out and amplify stretches of DNA unique to the microbe and copy them repeatedly until they become detectable—say, with a fluorescently labeled probe molecule. Most PCR machines can produce an answer within 2 to 4 hours.

    But PCR is still fairly labor-intensive, and it carries a risk of false positives, because amplified DNA from one sample is easily carried over to another. So for now, says Hughes, CDC is sticking to culture tests for the huge volumes it's processing.

    But PCR will assume a more prominent role in the detection of pathogens now that the technology is being miniaturized and made portable, predicts Stephen Morse, a public health professor at Columbia University and a former manager of the Advanced Diagnostics Program at the Defense Advanced Research Projects Agency (DARPA). PCR amplifies genetic material through a series of cycles in which the sample is heated—causing the double helix of DNA to split—and cooled, to let the polymerase enzyme assemble a new copy on each of the two strands.
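    The cycle-by-cycle doubling works out to exponential growth, which is why so few starting molecules suffice. The sketch below is a toy calculation; the function name and the ~10^9-copy detection threshold are illustrative assumptions, not figures from the article.

```python
def cycles_to_detect(initial_copies, threshold, efficiency=1.0):
    """PCR cycles needed for initial_copies to reach threshold.

    Idealized model: each heat-cool cycle multiplies the copy count
    by (1 + efficiency). Perfect doubling is efficiency = 1.0, which
    real reactions only approach.
    """
    copies, cycles = initial_copies, 0
    while copies < threshold:
        copies *= 1 + efficiency
        cycles += 1
    return cycles

# A single target molecule, doubling every cycle, crosses an assumed
# ~10^9-copy fluorescence-detection threshold in 30 cycles.
print(cycles_to_detect(1, 1e9))  # → 30
```

    Roughly 30 cycles at a few minutes each is what puts a conventional benchtop run in the 2-to-4-hour range quoted above.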

    Test case.

    Small and portable PCR machines help detect biowarfare agents even when there are no labs around.


    Companies such as Cepheid in Sunnyvale, California, and Idaho Technology in Salt Lake City, Utah, have built PCR machines whose reaction vessels are minuscule and can be heated and cooled in seconds; a thorough test can be completed in as little as 20 minutes. Just like the videophones that allow modern-day war correspondents to work in the most remote locales, portable versions of these new PCR machines fit in a hefty suitcase that can be taken virtually anywhere. And because the reaction is monitored from the outside, the vessels don't need to be opened at the end, as with older PCR machines, says Cepheid vice president Bill McMillan, limiting the risk of contamination and false positives.

    Until recently, CDC and the military bought limited numbers of these machines; now demand has soared, and Cepheid's stock price has tripled since 11 September.

    With funding from USAMRIID and DARPA, among others, Cepheid is also trying to eliminate another time-consuming step: sample preparation, which includes steps such as breaking open cells, purifying the DNA, and removing compounds that inhibit polymerase enzymes. The goal is to have all those steps done inside a disposable cartridge. The military's “dream,” says McMillan, “is that kids just out of high school can run them.” The company plans to present a prototype cartridge for anthrax to the Army by the end of the year.

    Beyond PCR

    But even such PCR tests probably would not be enough to deal with a large-scale bioterrorist attack or the use of biowarfare agents in a war, says Morse. Tens of thousands of people would need to be tested—which is why the military is funding research into the next generation of diagnostics.

    Some researchers are looking at microarrays, or “DNA chips.” Fast, small, and accurate, these chips would be ideal for use in the field. Chips can scan for the activity of hundreds or even thousands of genes at once, after which a computer could compare the pattern of active genes with a database of known pathogens and make a quick identification.

    In yet another approach, called matrix-assisted laser desorption/ionization mass spectrometry, microbes' contents are shattered, producing a spectrum of, say, peptides or nucleic acids, which again can be compared to similar spectra of known bacteria.

    Future techniques may also overcome the current difficulty of determining who's infected and who's not. A team led by Robert Lad at the University of Maine, Orono, has received DARPA funding to develop and test sensors that could detect minute levels of nitric oxide—a compound released by blood cells called macrophages, the body's scavenger cells, during an infection in the lungs. Such a device could help doctors decide who might have anthrax or another pulmonary infection early in the course of the disease.

    However, that project is currently “just conceptual,” says Morse—and even if it proves possible, it would take years of research. For now, health officials have to make do with what they have—and people who have potentially been exposed will have to go on taking their antibiotics.


    The Other Global Pollutant: Nitrogen Proves Tough to Curb

    1. Jocelyn Kaiser

    Experts call for international cooperation to slash nitrogen pollution, which they say ranks with greenhouse gases as an environmental threat

    POTOMAC, MARYLAND—To get a handle on one of the world's biggest environmental headaches, think about dinner. Say that tonight you eat 100-gram helpings of both rice and chicken. Producing those foods in rice paddies and chicken farms required 40 grams of nitrogen in fertilizer, 90% of which was wasted, leaking into the soil, water, and air. Add the 4 grams of nitrogen from the meal that you'll leave in the toilet, and that's part of your daily contribution to nitrogen-related problems such as algal blooms and smog, says biogeochemist Jim Galloway of the University of Virginia in Charlottesville. “Once [nitrogen] is out there, it just keeps circulating”—and polluting.

    Nitrogen is an essential element for the crops that feed the world's 6 billion people. But a surfeit of nitrogen, from fertilizers and the burning of fossil fuels, is harming ecosystems and threatening public health. Although the disruption of the nitrogen cycle has largely failed to attract the sweeping public attention accorded to other global pollutants, such as the chlorofluorocarbons that fray the Antarctic ozone layer and the carbon dioxide that spurs global warming, ecologists say that nitrogen's impacts are at least as great. “In terms of effects, [nitrogen] is way up there,” says Stanford ecologist Peter Vitousek.
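    Galloway's dinner tally can be written out as simple bookkeeping with the figures he quotes; this is an illustration of his arithmetic, not an independent estimate.

```python
# Reactive nitrogen released by one dinner, using the numbers quoted
# by Galloway: 40 g of fertilizer N, 90% of it wasted, plus 4 g
# excreted from the meal itself.
fertilizer_n_g = 40.0      # N applied to grow the rice and chicken
wasted_fraction = 0.90     # share leaking into soil, water, and air
leaked_n_g = fertilizer_n_g * wasted_fraction   # 36 g
excreted_n_g = 4.0         # N from the meal, left in the toilet
total_released_g = leaked_n_g + excreted_n_g
print(f"{total_released_g:.0f} g of reactive N per dinner")  # → 40 g
```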

    Yet control efforts are lagging, according to some 400 experts who met here last month to ponder how to plug nitrogen leaks while still feeding and powering the world.* “I think people haven't recognized the global nature of nitrogen and that human activity is altering the nitrogen cycle,” says Galloway, a meeting co-chair. One problem is that leaks stem from diverse sources, and the effects are many and varied, including ozone and soot air pollution, harm to forests from acid rain, oxygen-depleted coastal waters, and the loss of biodiversity.

    To help stem the flood, attendees—who ranged from agronomists to economists to energy analysts—called for more integrated policies that address the entire nitrogen cycle, including the creation of an international scientific body for nitrogen. A similar body, the Intergovernmental Panel on Climate Change, eventually led to the Kyoto global warming treaty. Participants also swapped ideas for more targeted, technological fixes, including solutions for developing countries, where producing food often takes priority over cleaning up the environment.

    The current nitrogen glut stems largely from one of the greatest achievements of modern science: synthetic fertilizer. The 1913 discovery of the Haber-Bosch process, still used today to convert inert N2 gas and hydrogen to ammonia (a reactive form of nitrogen that plants can use), spurred a leap in global crop yields. But even by 1970, a few researchers foresaw a potential downside, as the amount of reactive, or fixed, nitrogen released from fertilizer alone approached the amount fixed naturally on land, with unknown environmental consequences.

    Today humans produce about 150 teragrams of fixed nitrogen per year—1.5 times the natural terrestrial amount (see graph). Fertilizers are the chief culprit, although nitrogen oxides (NOx) from fossil fuel combustion constitute roughly 25% of total sources. All this reactive nitrogen then cycles from one polluting form to another: nitric acid, which causes acid rain; nitrates and other compounds in waterways, blamed for oxygen-depleted coastal waters; urban ozone and soot particles, which endanger respiratory health; and N2O, a potent greenhouse gas. Extra nitrogen is also likely lowering biodiversity by shifting the composition of plant ecosystems. “Once you break that triple bond [in inert N2], that N atom stays reactive for a very long time and then cascades through the environment” before microbes finally convert the nitrogen back to N2, says Galloway.
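    The budget implied by these figures can be checked in a few lines; the split below simply follows the 1.5-times ratio and the rough 25% NOx share given in the text, so the numbers are back-of-envelope, not a published inventory.

```python
# Anthropogenic fixed-nitrogen budget implied by the article's figures.
anthropogenic_tg = 150.0                  # Tg N fixed per year by humans
natural_tg = anthropogenic_tg / 1.5       # natural terrestrial fixation: 100 Tg
nox_tg = 0.25 * anthropogenic_tg          # fossil fuel NOx share: 37.5 Tg
fertilizer_etc_tg = anthropogenic_tg - nox_tg  # fertilizer and other sources
print(natural_tg, nox_tg, fertilizer_etc_tg)
```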


    Output of anthropogenic fixed nitrogen is still soaring and now far outstrips the natural terrestrial amount, leading to a host of environmental problems.


    Unfortunately, existing regulations have largely zeroed in on one pollutant or another rather than tackling the entire nitrogen cycle, notes forest pathologist Ellis Cowling of North Carolina State University in Raleigh, a conference co-chair. Even strict policies in nitrogen-choked countries such as the Netherlands have failed to work as well as hoped, because nitrogen pollution easily crosses national borders, and curbing one source can cause it to pop up elsewhere.

    Yet on the NOx front, at least, speakers at the meeting sounded a cautiously optimistic note. The U.S. Congress took aim at NOx with 1990 amendments to the Clean Air Act, and levels have stabilized. More reductions should come as requirements for cleaner-burning diesel engines and sport utility vehicles kick in over the next few years. In Europe, a multipollutant treaty called the 1999 Gothenburg Protocol is expected to cut NOx levels more than 40% by 2010. And the Bush Administration has said it expects to propose a plan that caps NOx and other pollutants and allows power plants to trade pollution “permits” in a market system, as they now do with sulfur.

    But far less progress has been made in curbing the nitrate and ammonia leaking from farmers' fields and animal waste. About 60% of sampled streams in the United States have total nitrogen levels above the background level of 1 mg per liter, and in Europe groundwater has even worse contamination, speakers noted. The problem is particularly acute in Asia, where growing crops to feed the population is a top priority. Synthetic fertilizer use there took off in the 1970s, and Asia now contributes 35% of the world's total synthetic nitrogen; its output is expected to double by 2030 to 100 teragrams of nitrogen per year, noted Congbin Fu of the Chinese Academy of Sciences. But specific measures could help minimize fertilizer use in Asia, suggested Rabindra Roy of the United Nations Food and Agriculture Organization, such as developing a simple machine to implant cakes of fertilizer deep in the soil and avoid runoff. Meanwhile U.S. officials are working on new policies, including a proposed rule to crack down on runoff from farms into rivers.

    Crop plants aren't the only culprits: 40% of the world's grain goes to feed livestock, which produces mountains of nitrogen-rich manure. The Netherlands illustrates the challenges: There, strict rules since 1986 have required steps such as keeping manure covered or plowing it into fields. But nitrogen still escapes, wafting off the fields as ammonia or washing into streams, says Jan Willem Erisman of the Netherlands Energy Research Foundation: “You can put it in the soil, but it comes out anyway.”

    The best way to address that problem—other than decreasing meat consumption—would be to reduce the amount of nitrogen animals release in the first place by feeding them a precise amino acid ratio, said Henry Tyrrell, an animal scientist at the U.S. Department of Agriculture; it's unneeded proteins that wind up as the nitrogen-rich urea in an animal's urine. But such precision feeding would be expensive.

    Indeed, cost is a factor in many of the technological fixes suggested at the meeting, such as capturing NOx produced by power plants for fertilizer. As long as energy prices stay low, making synthetic fertilizer will always be less costly, researchers said. “It's much too cheap,” Erisman says. Confronting the unyielding economics of the Haber-Bosch process may be policy-makers' biggest challenge.


    Smell's Course Is Predetermined

    1. Marcia Barinaga

    New studies suggest a high level of hardwiring in the olfactory system, perhaps explaining instinctual reactions to some scents

    From the aroma of coffee enticing us to rise and shine to the alarm we feel at the acrid smell of smoke, our brains are constantly responding to odors. The brain can differentiate the smells of thousands of chemicals, should they happen to waft into our noses and tickle sensory neurons located there. But how those neurons pass information to the brain, and how the brain processes it to discern, say, the scent of an orange from that of a tangerine, has remained a mystery. Two papers in the 8 November issue of Nature provide new insights into how the brain is structured for sorting out smells. In one study, Linda Buck and her colleagues at Harvard Medical School in Boston traced the connections of odor-responsive neurons in the brains of mice, providing the first glimpse of how the olfactory cortex, the part of the brain that processes odors, organizes incoming signals. In separate work, Liqun Luo's team at Stanford University revealed how links form between odor-responsive neurons and the brain.

    Both papers suggest that the olfactory system is highly genetically programmed, or hardwired, perhaps more so than vision and other sensory systems, says olfaction researcher Leslie Vosshall of Rockefeller University in New York City. And that may help smells to directly trigger instinctual behavior, some researchers speculate.

    The sensation of a smell begins when odor molecules bind to receptor proteins on sensory neurons in the nose. Mice have millions of sensory neurons, each bearing one of about 1000 different types of receptors on its surface. Those neurons send extensions called axons to a brain area called the olfactory bulb. Whereas neurons that bear the same receptor type are scattered rather randomly inside the nose, their axons sort out neatly, converging in each half of the bulb onto one or two structures called glomeruli, each of which receives input from only one receptor type. Most smells consist of multiple odor chemicals; the receptors they trigger send signals to a small subset of the 2000 or so glomeruli in the olfactory bulb.

    But what happens to the signals at the next step—the olfactory cortex—has been unclear. “The olfactory cortex has been terra incognita,” says neuroscientist Lawrence Katz of Duke University Medical Center in Durham, North Carolina. Some researchers have speculated that the connections linking the olfactory bulb to the cortex may be random and that the brain may learn to recognize the pattern each odor produces. Others suspected that there must be some predetermined order in the olfactory cortex; now Buck's team has found that to be true.

    Connect the dots.

    Buck's team traced the connections of sensory neurons bearing two different olfactory receptors (yellow and pink) from sites in the nose (left), to one glomerulus on each side of the olfactory bulb (near the nose), to multiple spots in the olfactory cortex.


    The team discovered the organization of connections by marking the pathway of neurons that receive signals from an individual type of odor receptor. Zhihua Zou and Lisa Horowitz in Buck's lab inserted the gene for a marker protein called barley lectin next to the gene for an olfactory receptor protein in mice. Neurons bearing that receptor made barley lectin and transferred the marker to connecting neurons, enabling the team to trace the olfactory neurons' connections through the olfactory bulb to the olfactory cortex.

    The researchers did this with two odor receptor genes. In each case, they found one or two stained glomeruli on each side of the olfactory bulb, as expected, and multiple clusters of stained neurons in the olfactory cortex. And the pattern of staining was clearly not random: For each receptor type the pattern was the same in all the mice. That shows for the first time “that there is order in the projection from the bulb to the cortex,” Katz says.

    What's more, the data suggest that pathways from different receptors converge in the cortex, an important feature if their signals are to be compared and processed. Buck's group deduced this because in the olfactory bulb, each receptor is represented in 1 out of 1000 glomeruli in each half of the bulb, or 0.1% of the bulb's area. But when a receptor's projections reach the cortex, they occupy about 5% of the area, suggesting that individual olfactory cortex neurons receive input from up to 50 different olfactory receptors. “Those signals have to be brought together,” says Harvard olfaction researcher Catherine Dulac, to enable the brain to analyze and distinguish among the patterns triggered by different smells.
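    The convergence estimate reduces to one division: a receptor's footprint in the bulb is 0.1% of the glomeruli, its footprint in the cortex about 5%, so each cortical site could see input from roughly 50 receptor types.

```python
# Buck's group's convergence arithmetic, using the fractions in the text.
bulb_fraction = 1 / 1000    # one of ~1000 glomeruli per half-bulb: 0.1%
cortex_fraction = 0.05      # a receptor's projections cover ~5% of the cortex
receptors_per_site = cortex_fraction / bulb_fraction
print(round(receptors_per_site))  # → 50
```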


    Stained projection neurons (green) in a fruit fly brain connect to glomeruli (bright green, foreground) and to higher brain areas (fainter green lines to the left).


    Luo's team, meanwhile, has filled in information about how neurons form the links that carry olfactory signals. In fruit flies, so-called “projection neurons” pick up the signal in the glomeruli and carry it to the fly's equivalent of the olfactory cortex. But what determines which connections those neurons make? The neurons might have no special instructions but wait for sensory input to guide them, as happens in the visual cortex. Alternatively, instructions may be programmed into the projection neurons from their birth—which is what Luo's team found.

    The 150 or so projection neurons are born sequentially from cells called neuroblasts. Graduate students Greg Jefferis and Lisa Marin marked projection neurons, using a shock of heat to turn on a marker gene in only those neurons being born during the heat shock. By varying the timing of the heat shock, the researchers could label different neurons in the birth sequence.

    The researchers heat-shocked thousands of fly larvae at various times in development. They found that the neurons that made connections with particular glomeruli always seemed to be born at roughly the same time. That suggested that, from the time a projection neuron is born, it knows which glomerulus to connect to. In a separate analysis, the team labeled neuroblasts, which then transferred the label to all of their progeny born after that time. In flies labeled early, a full set of glomeruli showed up. As the labeling time moved later, one by one the marked glomeruli disappeared. “They drop out in a defined sequence,” says Luo. “There is no exception.” The two analyses, Katz states, provide “absolutely compelling” evidence that the projection neurons get their marching orders at birth, based on when during development the neuron emerges.

    Indeed, the picture drawn by both papers is that “the organization of this very complicated sensory system seems to be highly predetermined,” says Rockefeller's Vosshall. And to many researchers, that makes a lot of sense. Smells are tightly linked to many instincts. If a young mouse had to learn from experience that the smell of coyote urine means danger, Vosshall notes, it may not get a second chance to use that information. It would be much more adaptive, she argues, for the coyote-urine smell to be hardwired into the animal's fear center by natural selection.


    A Fertile Mind on Wildlife Conservation's Front Lines

    1. Gretchen Vogel

    Renowned for getting captive elephants to breed, Thomas Hildebrandt and his team are extending their prowess to scores of other rare species

    BERLIN—Few people in the world are as intimately familiar with the elephant cervix as Thomas Hildebrandt. That particular knowledge may not make the 37-year-old veterinarian a hit at a cocktail party, but it has won him admirers in zoos around the world. Hildebrandt and his colleagues at the Institute for Zoo Biology and Wildlife Research (IZW) in Berlin have pioneered an innovative approach—artificial insemination (AI) technology aided by ultrasound—to impregnate half a dozen elephants, raising hope that zoos will be able to maintain their populations of these hard-to-breed creatures.

    Hildebrandt's ultrasound techniques “have revolutionized our ability to assess elephant reproductive health,” says Janine Brown of the Smithsonian National Zoological Park in Washington, D.C., where an elephant named Shanthi is expected to give birth next month—thanks to the “Berlin Boys,” as the IZW team is known.

    These days, the team's prowess at making babies is in high demand. Scores of zoo elephants are nearing the end of their life-spans, while recent laws have made it nearly impossible to import endangered species from the wild. Without AI, say several experts, zoo elephants could all but disappear within 2 decades. Other rare animals are also falling for the charms of these high-tech love doctors. Hildebrandt and colleagues Frank Göritz and Robert Hermes have used their ultrasound instruments to probe the internal organs of more than 200 species, including giant pandas, Komodo dragons, endangered European brown hares, and even invertebrates: Hildebrandt has used his technique to sex an octopus at Washington's National Zoo. “He'll ultrasound just about anything that lives or crawls,” says Richard Montali, head of the National Zoo's pathology department.

    The face of elephant husbandry.

    Thomas Hildebrandt and his probe have become intimate with the reproductive systems of animals ranging from Komodo dragons to octopuses.


    The team is currently working its charms on the rare white rhinoceros. This rhino has been hunted nearly to extinction for its horn, rumored to have aphrodisiac powers. Such power has sadly eluded white rhinos in captivity, where they have managed to breed successfully only a handful of times. The IZW researchers have pioneered techniques for both sperm collection and artificial insemination in the rhino, and they are waiting to hear whether an animal that they treated last month is pregnant. The Berlin Boys “are a sterling example of how science can work to the benefit of endangered species,” says Michael Keele of the Oregon Zoo in Portland, coordinator of the national species survival program for elephants in the United States.

    Key to the team's success is ultrasonic imaging, used in human medicine to visualize organs without penetrating the skin. A probe emits high-frequency sound waves, which tissues absorb or reflect to different degrees depending on their density. Although technicians can image a fetus by moving the probe outside a woman's abdomen, an elephant's skin and muscle layers are so thick that the sound waves cannot penetrate to the internal organs. Researchers can glimpse the animal's reproductive tract only from a closer vantage point: inside the digestive tract. Donning a shoulder-length glove, one of the team members inserts the specially designed ultrasound probe through the rectum and more than a meter into an elephant's colon and points it toward the reproductive organs. Although endoscopy exams are notoriously uncomfortable for people, most animals, if distracted by food, tolerate the procedure well.

    The sonogram is displayed on a TV monitor and inside a special helmet Hildebrandt designed to follow the probe even from a contortionist's position behind a 6-ton beast. To the untrained eye, the images seem an abstract blur, but years of painstaking correlation between the fuzzy shapes and postmortem dissections have allowed the IZW team to spot even subtle abnormalities. Less than a decade ago, says Montali, ovaries “were obscure structures you could only see in a dead elephant.” Using ultrasound, however, the IZW team has diagnosed ovarian cysts and uterine polyps and tumors that have prevented females from reproducing. It turns out that such abnormalities “occur with a frightening regularity” in captive elephants, says Keele. Hildebrandt estimates that one in four females has some sort of reproductive pathology. He suspects that allowing females to ovulate for many years without a pregnancy might be part of the problem. In the wild, he notes, female elephants spend most of their adult lives either pregnant or nursing a calf and produce very different levels of some hormones than females in captivity do.

    But female physical maladies explain only part of captive elephants' low fertility rate. Bulls housed together, for example, tend to have poorer quality semen than do bulls kept alone or with females. Indeed, scientists are just beginning to unravel the complex biological phenomena that may doom zoo populations of elephants.

    Where no man has gone before.

    The Berlin Boys examine an elephant's ovaries; their talents produced Abu (inset) in Vienna last April.


    Swimming upstream. When the Berlin Wall opened on 9 November 1989, tens of thousands of East Berliners streamed toward the celebrations at the Brandenburg Gate. But Hildebrandt was urgently bicycling the opposite way toward the Tierpark, East Berlin's zoo. That evening the IZW graduate student needed to give a hormone injection to a rare yak to try to induce it to produce multiple fertile eggs—and thanks to the timely injection, it did. The end of the Cold War ushered in many more opportunities for Hildebrandt, who stayed on at the IZW, one of the few institutes to emerge unscathed from the reorganization of eastern German science after reunification.

    The experiment that changed Hildebrandt's life, meanwhile, had begun a year earlier. He and his colleagues had been attempting to perfect an ovary transfer technique that required them to perform surgery on fetal goats. Often the trauma of surgery would cause a doe to abort, so the team sought the assistance of doctors who use ultrasound to perform minimally invasive surgery on humans. Hildebrandt quickly saw the technique's broader potential. “I was fascinated with this tool that would let us explore internal organs,” he says. He first tried the technique to image the reproductive tract of macaque monkeys but soon wondered if it might be useful with heftier beasts too.

    Hildebrandt was especially eager to try his hand with elephants. But the technical challenges of imaging from the inside of such a large animal were daunting. He had to design special probes, now patented, which could be maneuvered into the animal's rectum and aimed in the right direction. It took more than 2 years for the IZW group to perfect the procedure.

    From the start, Hildebrandt hoped ultrasound might aid efforts to artificially inseminate elephants. AI is a potential boon to elephant keepers, for an obvious reason: It is much easier to ship sperm than an elephant. The procedure has long been used to breed cattle and other livestock, but for decades no one had gotten it to work in elephants. Again, technical difficulties hindered the attempts: Ideally, the sperm should be deposited inside the 2-centimeter-wide elephant cervix, which is situated considerably more than an arm's length—more than 1 meter—from the opening of the so-called vestibule, a combined urinary-genital opening that prevents a straight shot to the cervix.

    But Hildebrandt's new ultrasound tools, coupled with a better understanding of the estrus cycle of female elephants, enabled two groups to succeed. In 1997, Dennis Schmitt of Southwest Missouri State University and the Dickerson Park Zoo in Springfield, Missouri, announced the first successful use of AI on an Asian elephant. A few months later, the Berlin researchers confirmed that their AI procedure had impregnated a 20-year-old African female at the Indianapolis Zoo. Two years later, both animals gave birth to healthy babies.

    Elephant AI is no easy feat. The sperm often doesn't survive freezing and thawing, so the process is carefully choreographed. When monitoring of a female's hormones and ovaries shows she is fertile, a team member travels to the chosen bull to collect fresh sperm and then jets off to the recipient female. Once the sperm arrives, the team uses the ultrasound probe to monitor the placement of a catheter—attached to a custom-designed endoscope—on its long journey to the cervix to deposit the sperm.

    The team members enjoy the unusual profile that comes with the job. “You can imagine the response when the flight attendant asks, ‘What's in the special case?’ and the answer is, ‘Elephant semen,’” says Harald Schwammer, vice director of the Vienna Zoo. There, the IZW team's AI techniques led to the birth of a healthy bull last spring.

    The Berlin Boys happily share their AI legerdemain, spending nearly half the year on the road. This month they're in Tanzania, collecting sperm from wild elephant bulls for cryopreservation experiments and using ultrasound to check the reproductive health of elephants in Selous National Park.

    If their research on sperm preservation bears fruit, it would give a huge boost to the prospects of the captive African elephant population, enabling managers to supplement the shrinking gene pool with “wild-caught” sperm. But the work in zoos also benefits wild populations, Hildebrandt says. Elephants hold our fascination as few other species do, so they play a charismatic role in teaching the public about conservation, he says. And attempts at conservation without understanding each species' reproductive pattern are doomed to fail—a lesson from Hildebrandt's own unique brand of intimacy.


    Japan Looks for Bright Answers to Energy Needs

    1. Dennis Normile

    A massive solar array to beam energy back to Earth is still a dream, but feasibility studies are moving ahead on several continents

    UJI, JAPAN—The bank of halogen lights mounted on a rack in a nondescript laboratory building here at Kyoto University isn't there to illuminate the room. Instead, it's part of a technology that space radio scientist Hiroshi Matsumoto believes will eventually light up the world. After years of neglect, the idea is also starting to shine more brightly in the eyes of government officials in Japan and elsewhere around the world.

    Demonstrating the technology to a visiting reporter, Matsumoto switches on the light, which strikes an array of photovoltaic cells. The generated electricity is converted to microwaves, which are transmitted across a meter or so of air to collectors. They convert the energy back into electricity that powers up tiny light-emitting diodes. “The technology demonstrably works,” says Matsumoto, who dreams of several square kilometers of solar panels in a geostationary orbit beaming back energy for use on Earth. Such a scheme, he says, “is the only way we can guarantee the energy needed to support a steady increase in living standards for the world's growing population.”
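    Each conversion in a chain like this loses energy, and the end-to-end efficiency is just the product of the stage efficiencies. The stage values below are invented for illustration, not measurements from Matsumoto's laboratory.

```python
# End-to-end efficiency of a light → electricity → microwave →
# electricity chain, as in the benchtop demonstration.  Every
# efficiency here is an assumed, illustrative figure.
stages = {
    "photovoltaic conversion": 0.15,
    "DC-to-microwave conversion": 0.80,
    "beam capture at the collector": 0.90,
    "microwave-to-DC rectification": 0.85,
}
overall = 1.0
for name, efficiency in stages.items():
    overall *= efficiency
print(f"overall efficiency ≈ {overall:.1%}")  # ≈ 9.2%
```

    Multiplying losses stage by stage is what makes the economics hard: even generous per-stage figures leave only a small fraction of the collected sunlight as usable power on the ground.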

    This science fiction-like vision has been around for decades, and it enjoyed a brief day in the sun during the oil shortages of the 1970s. But it's only recently that the giggles of skeptical colleagues are being replaced by signs of respect—and money—from government agencies.

    Japan's National Space Development Agency (NASDA) earlier this year commissioned two industrial groups to develop competing proposals for a space solar power test satellite that could be launched within this decade. “The agency and the aerospace industry are extremely interested in this project,” says Masahiro Mori, NASDA's space solar program manager. Japan's Ministry of Economy, Trade, and Industry (METI) is separately funding a study of a commercial space solar power plant that is expected to trigger additional funding. “We consider this a possible future energy option,” says Junya Nishimoto of METI's space industries office. And this year Matsumoto received $3.5 million from the Ministry of Education, Culture, Sports, Science and Technology to build a facility to test microwave antennas and receivers. Matsumoto estimates that in the last decade Japan has spent $10 million to $20 million, not including salaries, on both wireless power transmission and space solar power.

    Interest in space solar power is picking up in other countries as well. The French space agency, CNES, is watching the installation of the first operational wireless power transmission system on remote Réunion Island in the Indian Ocean for clues about its use as a power source for robots on Mars or the moon. In September, a committee of the U.S. National Research Council (NRC) said that a fledgling NASA program on space solar power deserves at least enough funding to do some serious research. “I am confident that with this positive peer review we will be able to move ahead with this research,” says John Mankins, director of the NASA program, which has spent $22 million in the past 2 years.

    Powerful idea. Hiroshi Matsumoto has built a prototype of an orbiting solar power system in his laboratory at Kyoto University.


    Some doubt that space solar power will ever prove economically competitive for terrestrial use. “The tasks are formidable, and [at present] it's not clear that you can identify a path that you know will solve the technology problems,” says Richard Schwartz, an electrical engineer and dean of engineering at Purdue University in West Lafayette, Indiana, who chaired the NRC panel. Schwartz, who remains neutral on the question of putting a solar power plant in space, nevertheless believes that an increased investment could bolster work on photovoltaics, robotics, and wireless power transmission.

    The concept of generating power from space goes back to Nikola Tesla, who in 1899 tried unsuccessfully to illuminate isolated homes by beaming energy from a tower set up in Colorado Springs, Colorado. In the 1960s, William Brown, an engineer at Raytheon Co., powered a small helicopter hovering above a transmitting array, and Peter Glaser, a mechanical engineer in Boston, proposed the idea of space-based solar arrays beaming energy to Earth (Science, 22 November 1968, p. 857). Since then, scientists have tried to show that the physical constraints are not insurmountable. In 1983 and again in 1993, Japan's Institute of Space and Astronautical Science transmitted microwave beams from one rocket to another, confirming that atmospheric scattering is negligible. It has also been shown that microwaves below 10 gigahertz suffer minimal damping from atmospheric water vapor.

    Although the NRC committee agrees with Matsumoto that the basic concept has been proven, it noted that “providing space solar power for commercially competitive terrestrial electric power will require breakthrough advances in a number of technologies.” There's also the problem of getting the necessary equipment into space. Both NASA and NASDA have programs to develop low-cost launch technologies based on either reusable rockets or inexpensive expendable rockets. Both will probably be needed to build a workable power grid: The NRC committee estimates that it would take 1000 space shuttle payloads to deliver the necessary material, an order of magnitude more than the number of missions needed to construct the International Space Station. Without breakthroughs in launching technology, space solar power “would be impractical and uneconomical for the generation of terrestrial base load power due to the high cost and mass of the components and construction,” the NRC report concludes.

    But enthusiasts see that long list of challenges as a rallying cry, not a signal to retreat. “It may take 2000 years, but humans are destined to civilize space,” says Matsumoto, who also chairs NASDA's space solar power advisory committee and is an adviser to the METI program. “And this will be one of the enabling technologies.”
