News this Week

Science  16 Nov 2001:
Vol. 294, Issue 5546, pp. 1430
  1. U.S. BUDGET

    NSF Research Bounces Back; Congress Funds New Facilities

    1. Jeffrey Mervis

    Congress last week pulled the 2002 budget for the National Science Foundation (NSF) back from the brink of fiscal calamity. By providing a 7.7% increase for NSF's $3.6 billion research account, legislators more than wiped out a 0.5% cut proposed by President George W. Bush in April (Science, 13 April, p. 182) and gave a green light to several NSF initiatives. At the same time, lawmakers accepted with minor tinkering the president's plans for NSF's education programs, boosting them 11% to $875 million.

    “I'm really, really pleased with our numbers for 2002,” said the visibly relieved NSF director Rita Colwell, whose campaign to double the agency's budget, to $8 billion, took a sharp hit from the incoming Bush Administration. “Given where we started, we didn't do so badly,” added a senior NSF official. NSF also largely avoided the earmarks—pet projects of individual legislators—that are spread liberally across the rest of a $113 billion bill covering housing, veterans affairs, NASA (see sidebar), and dozens of other agencies.

    Winning numbers.

    Director Rita Colwell likes the boost in NSF's 2002 budget.


    Within the research programs, the foundation's cross-disciplinary initiatives fared well, with information technology and nanotechnology each getting $25 million more than the request, to $297 million and $198 million, respectively. The increases for individual directorates varied from nearly 9% for the geosciences and engineering to only 3% for the social and behavioral sciences. Although Congress declined to specify a number, it told NSF to give “a high priority” to a much-touted mathematics initiative for which the agency had requested $20 million.

    The spending bill also overrode the Administration's desire to block any major new research facilities (Science, 27 July, p. 586). In addition to approving requests for terascale computing and an earthquake engineering simulation network, legislators added $15 million to start a 1-square-kilometer neutrino array under the South Pole, $12.5 million to begin the Atacama Large Millimeter Array in Chile, and $35 million to continue building a high-altitude plane to carry out atmospheric studies.

    Showing their traditional support for education, legislators told NSF to hike annual stipends for three graduate research and teaching fellowship programs from $18,000 to $21,500. That's $1000 above the agency's request and in line with Colwell's goal of making them more competitive (Science, 30 March, p. 2535).

    Appropriators also took the rare step of committing hard cash—$5 million apiece—for two proposals to bolster undergraduate science and engineering that are still working their way through Congress. One, for scholarships to undergraduates who agree to be science teachers, is described in a bill (H.R. 1858) by Representative Sherwood Boehlert (R-NY), chair of the House Science Committee; the other (S. 1549), championed by Senator Joe Lieberman (D-CT), would reward universities that promise to produce more science and engineering majors. The Administration's new science and math partnerships program to link universities and local public schools was funded at $160 million rather than the $200 million requested.

  2. U.S. BUDGET

    Pluto and Pork Win Out at NASA

    1. Andrew Lawler

    At first glance, NASA's $14.8 billion budget for 2002 appears to mark a modest victory for an agency struggling with cost overruns and a leadership vacuum. Some of the 3.8% boost approved last week by Congress, for example, will kick off a long-awaited mission to Pluto, and additional money will bolster a nascent effort to study the sun.

    But hidden in the fine print is $532 million for 136 projects put forward by individual legislators—nearly double last year's total and the largest amount of earmarks in the agency's history. That smorgasbord of pork leaves NASA with an increase of a scant $8 million and little flexibility to cope with a $75 million cut in the space station budget. Although this year's science programs won't be affected directly, agency managers say that overseeing the 2002 budget could be a fiscal nightmare for the successor to NASA Administrator Dan Goldin, who steps down this week.

    Long journey.

    Pluto mission needs outyear funding before it can fly.


    The $30 million payment for a 2006 Pluto mission comes after more than a year of lobbying by enthusiasts and scientists. “The people let Congress know that they want NASA to explore Pluto, … and Congress responded,” says Louis Friedman, executive director of the Pasadena, California-based Planetary Society.

    But there are two catches: First, Congress approved funding for only a single year, with no commitment beyond 2002. “It doesn't take a rocket scientist to see the problem there,” says space science chief Ed Weiler, who by 1 December is expected to choose a contractor to build the craft. Second, the mission requires a radioisotope electrical generator, but the only two on tap are penciled in for a 2008 mission to Jupiter's moon Europa. Building a third generator would be expensive, Weiler adds.

    Congress also capped the Europa mission at $1 billion, a figure that leaves little room for the cost of shielding the craft against harsh radiation. But it agreed to let the Jet Propulsion Laboratory in Pasadena, California, run the mission, after some had wanted an open competition. Lawmakers also provided an extra $10 million for the Living With a Star program to study the sun, and funds for advanced propulsion research to benefit future planetary missions.


    Partners Protest U.S. Plans to Shrink Crew

    1. Andrew Lawler

    A U.S. plan to scale back the international space station has triggered a revolt among the project's foreign partners, who fear the cuts would ruin their research programs. Led by Canada, the partners are demanding a meeting with U.S. officials to try to reverse cost-saving moves that could leave the station with half its planned six-member crew.

    A scaled-back station “would virtually eliminate the partners' collective ability to use” the station, declares a diplomatic note sent on 1 November by Canada to the U.S. State Department. A three-person crew would limit Canada to “30 minutes per week—which is not enough time to conduct any meaningful science,” according to notes accompanying the message.

    The outcry came as key lawmakers and a senior White House official tacitly agreed at a 6 November congressional hearing to support—at least temporarily—a less capable station with half the planned crew. The idea was put forward earlier this month by a panel led by retired aerospace executive Thomas Young (Science, 9 November, p. 1264). Meanwhile, lawmakers last week also approved a 2002 space station budget that cuts $75 million from the station's $2 billion annual budget and leaves little room for additions to reach its planned design (see previous page).

    Goodbye Columbus?

    European scientists hope that U.S. cuts won't cramp their use of the Columbus lab module.


    The partners, whose allocation of research time is based on the size of their investment in the station, say that the idea for a less capable version was hatched without their participation. In its note, Canada proposes a meeting of all the station participants to express their concern. That meeting, which has the backing of European and Japanese officials, likely won't take place until January.

    The timing of the Young report is particularly awkward for the European Space Agency (ESA). Ministers from 15 states—including 11 that are members of the station effort—are meeting this week in Edinburgh to approve a $900 million research program for the station from 2002 through 2006. ESA's Columbus laboratory is slated to be launched at the end of 2004. But with routine maintenance and operations taking up more than 80% of the time of a three-member crew, there will be little opportunity for research. That's why the original plan to have six or seven astronauts on board is “an essential requirement,” wrote Herbert Diehl, chair of the European partners' coordinating committee, in a 2 November letter to a senior State Department official.

    Japanese officials were working on a similar statement. Japan is preparing to launch a laboratory module that will require a significant crew to conduct experiments in a wide range of materials and life sciences areas.

    U.S. officials testifying last week assured Congress that the cuts need not affect the station's ultimate role as a research platform. The Young plan is “a good course of action” that the Bush Administration could endorse, says Sean O'Keefe, deputy director of the Office of Management and Budget. If NASA comes up with a credible way to complete the original version in the next 2 years, he added, “then there will be no diminution” of the space station's capability. House Science Committee Chair Sherwood Boehlert (R-NY) also expressed cautious support for the Young panel's suggestions, noting later that Congress was “in no mood” to boost spending on the station until NASA demonstrates better management oversight.


    A Variable Sun Paces Millennial Climate

    1. Richard A. Kerr

    Most scientists have long assumed that the sun shone steadily, its unvarying brightness the one constant in a climate system that seemed to lurch willy-nilly from one extreme to another over the millennia. From time to time, a few brave souls would suggest that the sun actually waxes and wanes with a steady beat, driving earthly weather or climate in predictable cycles. But the proposed correlations between sun and climate would usually collapse under closer scrutiny. Now, prospects are brightening for the putative connection between a varying sun and climate change on the scale of millennia.

    In a paper published online this week by Science, paleoceanographer Gerard Bond of the Lamont-Doherty Earth Observatory in Palisades, New York, and his colleagues report that the climate of the northern North Atlantic has warmed and cooled nine times in the past 12,000 years in step with the waxing and waning of the sun. “It really looks like the sun has mattered to climate,” says glaciologist Richard Alley of Pennsylvania State University, University Park. “The Bond et al. data are sufficiently convincing that [solar variability] is now the leading hypothesis” to explain the roughly 1500-year oscillation of climate seen since the last ice age, including the Little Ice Age of the 17th century, says Alley. The sun could also add to the greenhouse warming of the next few centuries.

    The new sun-climate correlation rests on a rare combination of long, continuous, and highly detailed records of both changing climate and solar activity. The climate record is a newly enhanced version of Bond's laborious accounting of microscopic rocky debris dropped on the floor of the northern North Atlantic during cold periods (Science, 25 June 1999, p. 2069). Ice on or around Canada, Greenland, and Iceland picked up these bits of rock and then floated into the North Atlantic, where the ice melted and dropped its load of debris. Bond and colleagues had found that the debris jumped in abundance every 1500 years (give or take half a millennium) as the ice surged farther out into a temporarily colder Atlantic. During spells of exceptional cold in the last ice age, huge amounts of ice crossed the Atlantic as far as Ireland, but even during the current warm interglacial interval a weaker millennial climate pulse continued across the Atlantic.

    Well-matched wiggles.

    The synchroneity of fluctuations in ice-borne debris (black) and carbon-14 (blue) suggests that a varying sun can cause millennial climate change.


    Records of solar activity are found in both carbon-14 in tree rings and beryllium-10 in cores of Greenland ice. Both isotopes are the products of cosmic rays striking the upper atmosphere. The solar wind of a brighter and more active sun would magnetically fend off more cosmic rays, decreasing production of carbon-14 and beryllium-10. Trees take up the carbon-14 to add new growth rings, and the beryllium-10 falls on Greenland snow that then forms annual ice layers.

    The test for a sun-climate connection comes when the two types of records are put together for comparison. The more in synch the sun and climate are, the more it looks like the sun is driving climate change. “It's a strong result,” says Bond. “You can do statistics on it,” but what really persuades him is “what you see” in a plot of the two records: the close match between the peaks and troughs of the climate record and those of the solar record. Simple analysis gives correlation coefficients between 0.4 and 0.6. “That's a very high correlation” for separate geologic records, says geophysicist Jeffrey Park of Yale University. “It's not on the margin. It shows that the connection is real.” Time series analyst David Thomson, soon to be at Queen's University in Kingston, Ontario, agrees that the statistics are good, “but their experiment may be good enough even without statistics. I think they've got a fairly convincing case.”
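
    A correlation coefficient of the kind quoted here is typically an ordinary Pearson correlation between the two aligned records. Below is a minimal sketch of how such a coefficient is computed on two equal-length, uniformly sampled series; it is illustrative only, since the actual ice-debris and carbon-14 records require careful interpolation and detrending before comparison.

    ```python
    import math

    def pearson(x, y):
        # Pearson correlation coefficient between two equal-length series.
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Toy example: two roughly co-varying series give a coefficient close to +1;
    # values of 0.4 to 0.6, as reported, indicate a substantial but imperfect match.
    print(pearson([1, 3, 2, 5, 4], [2, 4, 1, 6, 5]))  # ~0.92
    ```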

    As warm as the reception for a sun-millennial climate link may be, researchers caution that much is left to be sorted out. “It remains a little hard to figure out exactly how the sun has mattered to [recent] climate,” says Alley, “and why it has mattered so much.” The sun's dimming and brightening seems too small to have directly driven changes as dramatic as the Little Ice Age, especially in the high-latitude North Atlantic, where the chill seems to have been greatest.

    Alley points to growing evidence that solar variations can gain leverage on the atmosphere by altering the circulation in the stratosphere, which in turn changes the circulation in the lower atmosphere below (Science, 19 October, p. 494). Once near the surface, the solar influence might induce a change in ocean circulation. A self-sustained oscillation in the rate at which far northern North Atlantic waters sink into the deep sea had been the leading alternative to a solar influence. Given the cooling pattern they find across the Atlantic during the last few cycles, Bond and his colleagues suggest that deep-water formation does in fact oscillate, but the timing of the oscillation would be influenced by the now seemingly inconstant sun.


    Neutrino Oddity Sends News of the Weak

    1. Charles Seife

    Physicists are excited, once again, about a potential conflict with the Standard Model of Particle Physics. Measurements of the behavior of neutrinos, made by a team at the Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, suggest that the Standard Model may misgauge the strength of one of the fundamental forces of nature. Although not conclusive, the results might signify an undiscovered particle—or an experimental fluke.

    Particle trap.

    This giant detector at Fermilab gathered puzzling data on neutrinos.


    The Fermilab experiment measured θW (“theta-sub-w”), a quantity called the weak mixing angle. Although not an angle in the ordinary sense, θW smells like one to a mathematician. Roughly speaking, it measures the relation between the electromagnetic and weak forces: Different values of θW yield different pictures of the relative strengths of the forces at different energies. Unlike a similar-sounding quantity called the neutrino mixing angle, which determines the properties of neutrinos (Science, 2 November, p. 987), θW characterizes a fundamental force of nature, something that is fully accounted for in the Standard Model.

    So when the Fermilab researchers measured θW using neutrinos produced by the Tevatron accelerator, they didn't expect to see anything unusual. The Tevatron accelerated protons to high energies and slammed them into a beryllium-oxide target, producing kaons and pions with various charges. Using magnets, the scientists sifted these particles, picking out varieties that would decay and produce either neutrinos or antineutrinos. They then compared how the resulting neutrinos and antineutrinos interacted with a 700-ton steel detector. The neutrinos and antineutrinos have different spin states and thus are affected differently by the weak force—and θW. By comparing the neutrinos' behavior with that of the antineutrinos, the team figured out the size of θW.
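
    How comparing the two beams isolates the mixing angle can be illustrated with a standard textbook relation, the Paschos-Wolfenstein ratio of neutral-current to charged-current cross-section differences. This leading-order formula is only a sketch of the idea, not the Fermilab team's full analysis, which involves detector and QCD corrections not shown here:

    ```latex
    % Paschos-Wolfenstein ratio: a leading-order illustration of how the
    % neutrino-antineutrino comparison isolates the weak mixing angle.
    R^{-} \;=\; \frac{\sigma(\nu_{\mu}N \rightarrow \nu_{\mu}X) \;-\; \sigma(\bar{\nu}_{\mu}N \rightarrow \bar{\nu}_{\mu}X)}
                     {\sigma(\nu_{\mu}N \rightarrow \mu^{-}X) \;-\; \sigma(\bar{\nu}_{\mu}N \rightarrow \mu^{+}X)}
          \;=\; \frac{1}{2} \;-\; \sin^{2}\theta_{W}
    ```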

    The result surprised them. The measured value of θW disagreed with what the Standard Model predicts by three standard deviations—“three sigma.” “A three-sigma result is interesting; it gets people's attention,” says Kevin McFarland, a physicist at the University of Rochester in New York state and member of the Fermilab team. In particle physics, such a result is usually considered provocative but not ironclad. But McFarland is sanguine. “I spent the last 8 years of my career making one measurement,” he says, and after thorough checking and rechecking, the conflict with the Standard Model remained.
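
    For readers unfamiliar with the “sigma” shorthand: under the usual assumption that the measurement errors are Gaussian, a three-standard-deviation discrepancy corresponds to well under a 1% chance of being a random fluctuation. A minimal sketch of that conversion follows; it is standard statistics, not part of the Fermilab analysis itself.

    ```python
    import math

    def two_sided_p(n_sigma: float) -> float:
        # Probability of a Gaussian fluctuation at least n_sigma away from the
        # expected value, in either direction.
        return math.erfc(n_sigma / math.sqrt(2))

    print(two_sided_p(3.0))  # about 0.0027, i.e., roughly 1 chance in 370
    ```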

    If real, the anomaly might be caused by an undiscovered particle such as a hypothetical new carrier of the weak force called Z' (“Z-prime”), says Jens Erler, a physicist at the University of Pennsylvania in Philadelphia. “The [Fermilab] experiment is not explained by Z', but helped,” he says. When combined with another recent intriguing but inconclusive result in atomic physics, says Erler, it is “almost crying out for Z'.”

    But doubts will remain until new experiments can shed more light on the situation. “Three sigma can easily be a fluke,” says Erler. “But we take it seriously enough to have a really close look.”


    Single Gene Dictates Ant Society

    1. Constance Holden

    Genes regulating behavior are very hard to pinpoint; even basic behaviors are thought to be influenced by many genes interacting in mysterious ways. But fire ant researchers at the University of Georgia say they've characterized a gene that may single-handedly determine a complex social behavior: whether a colony will have one or many queens. The gene in question seems to work by controlling how ants perceive pheromones that tell them who's a queen and who isn't.

    The research “opens up for the first time the study of genes influencing social behavior across the whole span of the biological hierarchy,” says Andrew Bourke of the Zoological Society of London, “from the most basic, molecular level, through the level of individual behavior, right up to the social level.” The findings, by biologists Michael Krieger and Kenneth Ross of the University of Georgia in Athens, are published online this week by Science.

    Full pheromone power.

    Monogyne workers tending their queen possibly have more sensitive pheromone receptors than do some polygynes.


    Fire ants have two basic kinds of social organization. A so-called monogyne queen establishes an independent colony after going off on her mating flight, nourishing her eggs with her own fat reserves without worker help until they hatch and become workers themselves. Polygyne queens, in contrast, are not as robust, fat, and “queenly” as monogynes, says Krieger, and they need worker aid to set up new colonies. They spread by “budding” from one primary nest into a high-density network of interacting colonies. Monogyne communities permit only a single queen, and those with a resident royal kill off any intruding would-be queen. Polygyne colonies can contain anywhere from two to 200 queens and accept new queens from nearby nests.

    All these differences, the scientists suggest, depend on which version of a gene known as Gp-9 ants possess. It encodes a pheromone-binding protein that may be crucial for recognizing fellow fire ants. All ants in a monogyne colony have two copies of the B allele, but among the polygynes, at least 10% are heterozygous, carrying a mutant allele, b. Polygyne communities kill off any potential BB queens, but the Bb proportion of the colony somehow “persuades” the rest to accept Bb queens, says Krieger. They apparently escape attack because the Bb workers aren't very good at recognizing Bb queens. The b allele appears to code for a faulty protein, he says. He theorizes that this prevents the polygyne workers from detecting as many pheromones as the monogyne workers do, and this leads them to allow many young heterozygous queens to survive, whereas the more queenly BBs are easy to target.

    This is the first time scientists have nailed down the identity of “a single gene of major effect in complex social behavior,” the researchers claim in their paper. Until now, about the closest anyone has come to a social behavior gene is one that determines whether nematodes, tiny roundworms, clump together when food is plentiful, says Krieger. But nematodes don't really count because they're not “social animals,” like bees or ants or primates, he claims.

    Some researchers already suspected that this gene was pivotal in determining the shape of fire ant society, but the notion was “so unexpected as to arouse skepticism,” says entomologist Ross Crozier of James Cook University in Townsville, Australia. Crozier says many scientists cling to the view that “genetic details are not important in studying evolutionary changes, because changes are due to many genes of small effect.” The fire ant story is “a significant new example” to the contrary.


    Multitasking Is This Plankton's Trademark

    1. Elizabeth Pennisi

    One of the organisms responsible for red tide, which can kill marine species and close down fisheries, is proving to be quite ingenious in how it manages its metabolism. Like some other cyanobacteria, Trichodesmium fixes nitrogen to make ammonia and also produces oxygen by means of photosynthesis. Usually these two activities are chemically incompatible, leading researchers to wonder what goes on inside this filamentous plankton. Now, after 40 years of speculation, researchers think they have figured out how Trichodesmium seemingly manages both processes at once.

    The work “is starting to peel back how [Trichodesmium] gets away with what it gets away with,” comments Douglas Capone, a microbial ecologist at the University of Southern California in Los Angeles. This organism is key to providing marine life with nitrogen, thereby stimulating ocean productivity, he adds.

    As reported on page 1534, this cyanobacterium carefully balances the amount of time it spends on photosynthesis and nitrogen fixation, shifting from one process to the other. “We're showing a decline in photosynthesis when nitrogen fixation is high,” says study co-author Ilana Berman-Frank, a phytoplankton ecologist at Rutgers University in New Brunswick, New Jersey. This prevents oxygen, a byproduct of photosynthesis, from damaging the nitrogen-fixing enzyme nitrogenase. The threat of such damage normally prevents an organism from performing both processes at the same time or in the same cell.

    Room of its own.

    Fluorescent antibodies light up filaments' centers, revealing localized nitrogen fixation in Trichodesmium.


    Richard Dugdale, now at the Romberg Tiburon Center for Environmental Sciences at San Francisco State University, first discovered that Trichodesmium fixes nitrogen some 40 years ago. He brought a mass spectrometer on board a research ship to make field measurements of the plankton's metabolism. At the time, most microbiologists believed that nitrogen fixation occurred only at night or in specialized cells called heterocysts that don't make oxygen. Unlike other cyanobacteria, Trichodesmium lacks these cells, and skeptics ridiculed Dugdale's observations as “nitrogen fiction.”

    Over the years that fiction became fact, but still no one could figure out how this organism managed to fix both carbon (by means of photosynthesis) and nitrogen at the same time—and during the day at that.

    To find out, Berman-Frank and her colleagues used a technique called fast repetition rate fluorometry to track photosynthesis as it occurs by measuring the fluorescence patterns. They monitored nitrogenase activity at the same time. The experiment showed that the oxygen-producing enzymes worked all day, except for a several-hour-long midday siesta, during which nitrogenase activity was in full swing.

    Then Berman-Frank and her collaborators took an even closer look at what was going on inside individual cells. Hendrik Küpper of the Institute of Microbiology in Trebon, Czech Republic, used a customized microscope that tracked oxygen production by monitoring the changing fluorescence that occurs during photosynthesis.

    They found that cells could shut down photosynthesis within 15 minutes to allow nitrogen fixation to occur. They also found that this shutdown was often confined to parts of the filament, typically at their centers. Thus they think the cells have exquisite control of where and when these two processes go on.

    “These are very elegant experiments,” notes Jonathan Zehr, a microbial ecologist at the University of California, Santa Cruz. Adds Edward Carpenter, who is also at the Romberg Tiburon Center, the study “goes a long way to explain how the organism does this.”

    Because Trichodesmium is ancient compared to other cyanobacteria, Berman-Frank and her colleagues think that its mechanism for orchestrating photosynthesis and nitrogen fixation is a primitive one and that specialized cells came later. “This may be a missing link and a precursor to how cyanobacteria evolved,” agrees Capone. But Zehr isn't so sure. He thinks Trichodesmium species themselves could represent a highly specialized group of organisms that simply branched off early from other cyanobacteria. Nor is he convinced that this new work gets to the bottom of this paradox. “I don't think it's totally solving the riddle,” he says.


    Labs Tighten Security, Regardless of Need

    1. Joshua Gewolb

    Michael Walter does not keep anthrax in his laboratory. But since the nationwide anthrax scare began last month, the microbiologist at the University of Northern Iowa in Cedar Falls has installed new locks to safeguard strains of a harmless cousin, Bacillus cereus. “Everyone was giving me a hard time,” he says. “So I locked up some related strains … just to put people's minds at ease.”

    University researchers and administrators around the country say that the public's fear of bioterrorism has led them to increase security in labs that study anthrax—whether or not they keep bugs that could infect humans. The new precautions include police guards, security cameras, new locks, key-card identification systems, motion detectors, and remodeling that reduces the number of building entrances. Scientists say that although they don't think their labs were ever unsafe, more protection can't hurt, and the measures are good public relations.


    The most extreme measures may have been taken by Iowa Governor Thomas Vilsack, who called out the National Guard in response to false reports of a direct link between the state and the anthrax strain that killed four people. (The anthrax used in the attacks has been identified as the Ames strain, which was first cultured decades ago at Iowa State University in Ames and is now widely disseminated among research labs in the United States.) After the troops arrived, Iowa State microbiologists decided that keeping anthrax spores as part of their general bacteriological collection was more trouble than it was worth. So on 12 October they autoclaved the entire anthrax collection. “We made sure nobody needed the cultures and that they weren't of value to anyone, and then we proceeded to destroy them,” says Iowa State microbiologist Jim Roth.

    Many researchers say that security at anthrax labs has always been excellent. Microbiologist Rodney Tweten of the University of Oklahoma Health Sciences Center in Oklahoma City says that he hasn't taken any extra precautions at his lab, because the cloned anthrax DNA that he studies could not harm anyone and because existing security is high.

    The costs of additional security can be sizable. Microbiologist Paul Keim of Northern Arizona University (NAU) in Flagstaff, who maintains live anthrax for his studies of the differences between strains, estimates that the university spent up to $50,000 last month on security upgrades. NAU vice provost Carl Fox says the lab is now monitored by security guards, new locks have been installed, and there's a wall where a door once stood.

    But the fallout from underestimating public fears also can be significant. Detroit television station WXYZ-TV this month aired a report questioning the security in the laboratory of pathologist James R. Baker, a so-called anthrax researcher at the University of Michigan, Ann Arbor, who has used the harmless Bacillus cereus in the past. When the TV crew members attempted to enter Baker's laboratory, they were rebuffed by a locked door and, later, challenged by lab personnel. But scenes of the reporter freely entering an adjacent laboratory, even though it is used to study hearing, left the impression that university labs were vulnerable. The video, which aired several times over 2 days, forced university public relations officials to work frantically to calm fears about campus security.

    Researchers who question the value of additional security point out that anthrax is widespread in soils. They also note that a potential terrorist breaking into an anthrax lab might not know what to look for. “It would take me at least a couple of hours to find anything, and I'm the dean,” says Michael Groves, who heads the Louisiana State University School of Veterinary Medicine in Baton Rouge, where police guard a building that houses anthrax cultures pending the installation of a new key-card identification system.

    Still, researchers say precautions are worthwhile if they reassure a jittery public. “The business of allaying fears is very important,” says Northern Iowa's Walter. And although some scientists say that the possibility of theft should not be discounted, most express confidence that their labs are secure. “Barring a SWAT team or someone with bazookas, I think we actually have a pretty safe situation for the cultures,” says Keim.


    Congress Weighs Select Agent Update

    1. Martin Enserink,
    2. David Malakoff

    U.S. researchers may soon be haggling with the government over which viruses, bacteria, and biological toxins should be tightly regulated. Congress this week was expected to begin debating a proposal to impose new security requirements on laboratories working with these pathogens and to update the government's list of about 40 regulated “select agents.” But experts say that it is unclear whether the core list—which the United States hopes other countries will adopt—should expand or shrink.

    The legislation, to be introduced by Senators Edward Kennedy (D-MA) and Bill Frist (R-TN), is the latest congressional response to the anthrax mail attacks that have killed four people in the United States. It follows a newly imposed ban on the possession of such agents by scientists from so-called terrorist nations. The latest proposal—some version of which is expected to become law within weeks—is intended to boost government spending on vaccines and strengthen the nation's defenses against bioterrorism. But it would also increase federal oversight by requiring greater lab security and registration of select agent collections and certain types of research equipment.

    Deadly addition.

    The Nipah virus would be a candidate for a new list of regulated bioagents.


    The debate over which agents to include would be triggered by language ordering the Department of Health and Human Services (HHS) to revise its select agent list every 2 years. Although periodic review is a good idea, say experts, the exercise is likely to be dogged by technical disagreements. “Coming up with the first list wasn't easy,” recalls Janet Shoemaker of the American Society for Microbiology, which in 1996 helped the Centers for Disease Control and Prevention (CDC) in Atlanta evaluate hundreds of candidates to comply with a law that requires registration for labs that ship or receive potential bioweapons. “No two people ever agree on what should be on these lists,” says David Franz, a former commander of the U.S. Army Medical Research Institute of Infectious Diseases in Fort Detrick, Maryland.

    Researchers involved in the evaluation say there was consensus on listing highly lethal organisms that are relatively easy to turn into weapons, such as smallpox, anthrax, plague, tularemia, and a number of viral hemorrhagic fevers. But other agents sparked debate. A draft list generated nearly 70 letters, and CDC responded by dropping agents such as Western equine encephalitis virus and a bacterium called Chlamydia psittaci and adding equine morbillivirus and a fungus called Coccidioides immitis. The current list contains 13 viruses or virus groups, nine bacteria, three rickettsiae species, a fungus, and 12 types of toxins (Science, 2 November, p. 971).

    One challenge for the CDC will be squaring its list with similar compilations by other bioweapons experts. Western equine encephalitis, for instance, is still listed as a potential threat by another CDC analysis. A loose consortium of 34 countries that works to limit the export of biothreats, called the Australia Group, includes food- and waterborne diseases, such as salmonella and cholera, that are absent from the CDC list. The North Atlantic Treaty Organization, meanwhile, has its own list that includes dengue and influenza. These agents are not usually fatal, but they can bring an army to its knees. There are also extensive lists of potential agricultural threats, but Congress appears content to leave their regulation in the hands of the U.S. Department of Agriculture.

    Another problem is that the world doesn't stand still. “We learn new information all the time,” says Robert Shope, a virologist at the University of Texas Medical Branch in Galveston. In 1999, for instance, the newly identified Nipah virus killed more than 100 people in Malaysia and decimated its pork industry (Science, 16 April 1999, p. 407).

    Future listmakers must balance the benefits of being comprehensive against the costs of burdening law enforcement and research efforts, say bioterror experts. One option is to split the list into two classes, with the riskier agents—such as anthrax—subject to more stringent regulation. Administration officials have also floated the idea of setting up a new enforcement office within HHS to police microbe research, because CDC, a public health agency, has traditionally resisted that role. “We are not a regulatory agency and don't profess any expertise or much experience in that,” CDC head Jeffrey Koplan told reporters last week.

    The scope of the list will determine the number of researchers and laboratories affected. About 250 U.S. university, government, and private labs are registered to handle the agents on the current list. But CDC officials expect that number to grow because of a processing backlog and because many labs have been tardy in filing their paperwork. Scientists in other countries could be subject to similar systems if their governments follow the U.S. lead, and the United Kingdom has already introduced legislation to criminalize possession of certain bioagents. The Bush Administration supports that approach as a substitute for the proposed Biological and Toxin Weapons Convention protocol, now stalled.


    Immune Gene Linked to vCJD Susceptibility

    1. Michael Balter

    Researchers have found that a common variation of an immune system gene may offer some protection against variant Creutzfeldt-Jakob disease (vCJD)—the fatal neurodegenerative disease linked to eating cattle infected with “mad cow disease.” The new finding—the second genetic factor discovered so far that influences susceptibility to the disease—may help to identify high-risk individuals and provide some clues to the modus operandi of this mysterious, incurable malady.

    Most researchers believe that vCJD and similar diseases in humans and animals are caused in whole or in part by aberrant proteins called prions, which are misfolded versions of a normal cellular protein called PrP. Infection with vCJD appears to be caused by ingesting prions in contaminated meat.

    Researchers have long known that an individual's genetic makeup influences susceptibility to the disease. One genetic factor exerts a particularly powerful influence: So far, every one of the more than 100 people diagnosed with vCJD in the United Kingdom has both copies of the PrP gene producing the amino acid methionine at position 129. And two papers published earlier this year in the Proceedings of the National Academy of Sciences showed that at least seven other genes affect susceptibility to prions in mice. “We knew that there must be a whole range of other genes” that determine prion disease risk in humans, says neuropathologist James Ironside of the National CJD Surveillance Unit in Edinburgh, U.K.

    Now, one of those genes may have been identified: A team led by neurologist John Collinge of Imperial College, London, reports in this week's issue of Nature that more than 80% of vCJD victims lack a fairly common version of an immune system protein called human leukocyte antigen (HLA), which helps the immune system recognize infectious agents.

    Prion proof?

    A common genetic factor may give protection against the human form of mad cow disease.


    The HLA proteins, which vary from individual to individual depending on their genetic makeup, come in many different forms. One form, called HLA-DQ7, was found in 70 out of 197 normal control individuals, or 35.5%. But Collinge's team found that of 50 vCJD patients studied, only 6, or 12%, had the DQ7 variety of HLA. The researchers conclude that DQ7 somehow protects individuals from vCJD. One possibility would be that DQ7 helps the immune system fight off prions. But DQ7 could play another role: Recent studies suggest that prions must travel through immune system organs such as the lymph nodes and the spleen to get from the gut to the brain. If HLAs are involved in this transport, for example by binding to the prions—and if DQ7 is less efficient in this role—this could also explain the results.

    “We know that lymphoid organs are crucially important,” says neuropathologist Adriano Aguzzi of the University of Zürich. “So it hardly comes as a surprise that the second [genetic factor] to be discovered” is an immune system component. But Aguzzi, as well as the paper's authors, caution that DQ7 may play only an indirect role: It could just be closely linked to other genes that are more directly controlling susceptibility to vCJD. “Time, additional patients, and additional genetic studies will be needed to settle this issue,” Aguzzi says.


    Genetic Change Wards Off Malaria

    1. Elizabeth Pennisi

    Researchers have determined that a genetic alteration can provide almost complete protection against malaria, which in Africa kills 3000 children each day. The alteration produces a form of the oxygen-transport molecule called hemoglobin C. Another version of hemoglobin, hemoglobin S, also helps ward off malaria, but that protection comes with a steep cost: People who inherit two copies of the hemoglobin S gene develop severe sickle cell anemia. By contrast, hemoglobin C seems to have no adverse effects. Understanding how hemoglobin C works could help guide the development of vaccines and drugs, says Thomas Wellems, a malaria researcher at the National Institute of Allergy and Infectious Diseases in Bethesda, Maryland.

    Earlier work had hinted at a protective role for hemoglobin C. Now, in the 15 November issue of Nature, malaria researchers Mario Coluzzi and David Modiano of the University of Rome La Sapienza and their colleagues demonstrate its value. The work “shows that hemoglobin C is much more important than hemoglobin S,” comments Francisco Ayala, a population and evolutionary geneticist at the University of California, Irvine, who has studied the origins of this disease.

    Good genes.

    Some West African children have antimalaria hemoglobin.


    Modiano and Coluzzi and their colleagues pinned down hemoglobin C's role by studying 4348 Mossi children in Burkina Faso, West Africa. They took blood samples from 3500 healthy children, 359 children with severe malaria, and 476 children who had less serious cases. The team then determined which versions, or alleles, of the hemoglobin gene were present in each sample.

    They found a striking difference between healthy and sick children in the distribution of genotypes: two copies of the typical hemoglobin allele, A; two copies of the C allele; or one of each. Few sick children carried even one hemoglobin C allele, suggesting that the allele is beneficial. Indeed, the researchers found just one malaria patient who had two copies of the hemoglobin C gene, whereas 14 would be expected had there been no protective effect. Moreover, other circumstances may have increased that child's vulnerability to malaria, says Coluzzi. Having a double dose of hemoglobin C “is almost complete protection,” Coluzzi explains.
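
    The “expected” figure quoted above is the number of two-C-allele (CC) patients one would see if that genotype were just as common among malaria cases as among comparable healthy children. Below is a minimal sketch of that null-hypothesis expectation; the genotype frequency used is an illustrative placeholder chosen only to reproduce the quoted figure, not a value reported in the study.

    ```python
    def expected_cases(control_genotype_freq: float, n_patients: int) -> float:
        # Under the null hypothesis of no protection, the CC genotype should be
        # just as frequent among malaria patients as among healthy children.
        return control_genotype_freq * n_patients

    # Illustrative placeholder: a CC frequency of ~1.7% among the 359 + 476 = 835
    # malaria patients would predict ~14 CC cases, against the 1 actually observed.
    print(expected_cases(0.017, 359 + 476))  # ~14.2
    ```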

    Coluzzi cautions that these results won't translate quickly into better treatments. But they do speak to how these hemoglobin genes evolved in African populations—a history that is quite intriguing, says Clive Shiff, a malaria researcher at Johns Hopkins School of Medicine in Baltimore. For example, if two copies of hemoglobin C are as beneficial as these data show, “one would expect hemoglobin C's [prevalence] in the population to change very fast,” he explains. If natural selection were at work—and if there were no hidden fitness costs to having this hemoglobin—many more people with hemoglobin C would survive to reproduce than would those with other alleles. Thus, their descendants should quickly predominate. Yet the other hemoglobin alleles seem to be persisting through time, Shiff points out.

    Coluzzi attributes this phenomenon to the availability of malaria treatments. He estimates that the genetic mutation that created hemoglobin C occurred only about 1000 years ago in the Mossi and that the allele is now spreading rapidly among that group. But it would replace the other alleles only “in the absence of drugs,” he explains, because drugs enable those with malaria to survive and reproduce.

    To find out more about hemoglobin's place in humankind's fight against malaria, Coluzzi's team is keeping track of the children with this allele to see if their hemoglobin continues to keep them healthy.


    Cell Phone Lawsuits Face a Scientific Test

    1. Mark Parascandola*
    1. Mark Parascandola is a writer in Washington, D.C.

    Evidence that radiation from cell phones increases the risk of brain cancer has not fared well in the court of scientific opinion; will it hold up in a court of law?

    Christopher J. Newman, a 41-year-old Baltimore neurologist, began using a Motorola cell phone in 1992 to keep in touch with his patients. In 1998, he was diagnosed with a life-threatening brain tumor, which he concluded was caused by the phone. Two years later, he filed suit against Motorola and several other manufacturers and service providers seeking $720 million in compensation and punitive damages. Newman's tumor, the complaint reads, is “a direct and proximate result of the defective and unreasonably dangerous condition of the relevant products,” which the manufacturers failed to prevent or warn him about.

    Newman's case is not the first of its kind, but it represents an escalation in the battle over claims that public health is threatened by electromagnetic radiation from cell phones and other devices. It also marks the crest of a new wave of such lawsuits filed by plaintiff lawyers across the United States since last spring in Louisiana, Pennsylvania, New York, Maryland, and Georgia. And it could have broad implications for other cases, because it will use some new, science-based standards to determine whether these claims should be examined by the court.

    The Newman case is being pursued by Baltimore superlawyer Peter Angelos, who is also behind some of the other lawsuits. Angelos, owner of the Baltimore Orioles, made a fortune in the 1980s suing asbestos companies on behalf of workers and headed Maryland's litigation against the tobacco industry. The Angelos team faces substantial challenges. In civil actions the burden of proof rests squarely on the plaintiff, and several complaints alleging phone-induced cancer have already been dismissed. Most of the human health studies completed so far have failed to find any harmful effects from cell phones, and most experts agree that the science remains uncertain. Newman's case, which is the closest to trial, will face a critical review on its scientific merits when it comes before a federal judge for a pretrial inquiry in Baltimore next January.

    Cause or coincidence?

    Plaintiffs link cell phone use to a variety of cancers, although they can point to only scant epidemiological evidence.


    Courts have been under increasing pressure to screen cases for merit and keep “junk science” from going before a jury, but they are moving at their own pace. Says David Faigman, a professor at the Hastings College of Law in San Francisco and author of Legal Alchemy: The Use and Misuse of Science in the Law, “We're very much in a transition period. The law has finally joined the scientific revolution, but it will be some time before the culture of science makes its way into the caverns of the law.” All eyes will be on Baltimore to see whether the case will pass this crucial test.

    Judging the science

    The opening salvo in the battle over cell phones took place during a TV show. On 21 January 1993, Florida businessman David Reynard announced on CNN's Larry King Live that he was suing the manufacturer of his wife's cell phone, along with the service carrier, because he believed that it had caused her fatal brain cancer. She had been using the phone for less than 1 year, but a magnetic resonance imaging scan showed her tumor to be, according to Reynard, “directly next to the antenna, and [it] seemed to be growing inward from that direction.” Stocks of cellular phone companies dipped the following week, and the industry, in a gesture of concern, later agreed to put up $25 million for research on the health effects of cell phones.

    But when pretrial maneuvering began in Reynard's case in 1995, the evidence proved insubstantial. Reynard had relied heavily on an expert witness named David Perlmutter, a neurologist who runs an alternative and complementary medicine clinic in Naples, Florida, and who had appeared with Reynard on talk shows. In a written statement, Perlmutter admitted that no studies had shown any adverse biological effects from cell phones, but he suggested that studies using animals and cell cultures “provide strong inferential data that use of this device represents a clear health hazard.” The court dismissed the case, stating that Perlmutter's expert opinion “did not meet [the] Daubert standard for admissibility of scientific evidence.”

    This standard was established in the landmark 1993 U.S. Supreme Court ruling, Daubert v. Merrell Dow Pharmaceuticals. In that case, parents claimed that the use of a morning sickness drug had caused birth defects in their children, but their evidence was limited to a few animal studies and unpublished reanalyses of data from negative epidemiologic studies. The Supreme Court agreed to hear the case because the legal system had been grappling with the appropriate standards for admitting scientific testimony into the courtroom, particularly for “toxic tort” lawsuits. The Daubert decision, released just 5 months after Reynard appeared on the Larry King show, made judges responsible for ensuring that a scientific expert's testimony is based on evidence that is both reliable and relevant.

    Legal reformers had been pushing for such changes for years, frustrated that any medical doctor could give an opinion in court about the causation of injury—even in the absence of evidence. In 1998, the Supreme Court raised the bar higher in the case of General Electric Co. v. Joiner, in which lung cancer was attributed to polychlorinated biphenyl exposure. The Supreme Court urged judges to analyze an expert's reasoning, conclusions, and the studies they are based on. It added that judges may exclude expert testimony when they “conclude that there is simply too great an analytical gap between the data and the opinion proffered.”

    Michael Green, an expert on product liability law and professor at Wake Forest University School of Law in Winston-Salem, North Carolina, affirms that “Daubert has made a sea change of difference.” Last year, a study of expert testimony conducted by the Federal Judicial Center found that judges are holding more pretrial hearings on evidence and excluding more expert testimony than in the days before Daubert. In some cases, such as in the breast implant litigation, judges have even hired their own expert advisers (Science, 3 January 1997, p. 21).

    Gold standard

    In the years since Reynard's lawsuit, results from several epidemiologic studies of cell phones and cancer have trickled in. Three studies released last winter amid much publicity failed to find any adverse effects. Two of them, one funded by the National Cancer Institute and one by the telecommunications industry, are based on interviews with several hundred brain cancer patients about their cell phone use and with controls who did not have cancer. The third, a massive Danish study, matched the names of 420,000 cell phone subscribers with the Danish Cancer Registry to determine whether they had an unusually high cancer rate; it also found no association. The Journal of the National Cancer Institute followed up with an editorial by physicist and junk-science warrior Robert L. Park of the University of Maryland, College Park, concluding that “we now have a convincing answer” to the question.

    Courts and regulators generally consider epidemiologic studies such as these to be the gold standard for measuring human health risks. But they still drew criticism from some researchers. A major limitation in all of them, the authors admit, is that most of the subjects had been using their phones only for 1 to 3 years, whereas cancer takes 10 to 15 years to develop. Swedish epidemiologist Kjell Hansson Mild explains that in order to detect a modest increased cancer risk from cell phone use, “we need a long latency period and large numbers of people. A 10-year latency is what you should aim for.”

    Prime mover.

    Baltimore attorney Peter Angelos has orchestrated a class action suit against cell phone-makers and service providers.


    Critics have also questioned whether these studies measured exposure accurately. In the Danish study, all cell phone subscribers were categorized as “exposed,” whether they used their phone only for emergencies or gabbed for hours at a time. Bruce Hocking, an Australian specialist in occupational medicine and former medical officer for an Australian telecommunications company, wrote in a letter to the Journal of the National Cancer Institute that some subscribers may also have used an earpiece or other device that kept the phone's antenna away from their head, leading to “appreciable imprecision” in the data.

    One epidemiologic study, not yet published, has found an increased risk of benign brain tumors. In June, Mild and colleagues at Sweden's Örebro University announced at a conference in London that they had found a 35% increase in risk among 5-year subscribers, which rose to 77% among 10-year subscribers. The increase was seen only for analog phones, but, the researchers cautioned, the newer digital phones, which put out less radiation, might not have been around long enough for their effects to be seen. They also found an increase in malignant tumors, although it was not statistically significant.

    Epidemiologic research would be much further along if it were not for another lawsuit that shut down a potentially valuable study. With industry funding, epidemiologists Kenneth Rothman and Nancy Dreyer of the Epidemiology Research Institute (ERI) near Boston planned to cross-reference subscriber records of 300,000 cell phone users with the National Death Index. They hoped to compare the cause of death with the time the deceased had spent on the phone. But they were able to analyze only 1 year's worth of deaths, not enough to expect to see an effect, before a Chicago law firm claiming to represent 40 million cell phone users brought a class action lawsuit that halted their work in 1995.

    The suit, which sought damages for alleged harm from cell phone use, claimed that the companies had conspired to thwart investigations into product safety. The suit also took aim at the ERI study, charging that industry and the researchers had invaded subscribers' privacy by accessing their phone records. Rothman and Dreyer's research remains on hold while the lawyers continue to fight over it. “The lawsuit is completely without any merit,” says Rothman, “but it has been very effective in killing our study.”

    Some researchers believe it will be at least a decade before consistent epidemiologic findings can settle the question either way. There are few shortcuts in epidemiology, and because cell phone technology changes (from analog to digital, for instance), the target itself may prove elusive. As Rothman wrote in The Lancet recently, “it is too soon for a verdict on the health risks from cellular telephones.”

    Proof by other means?

    The lack of positive epidemiological data has not put a brake on cell phone litigation. The Angelos team charges that cell phone firms have misrepresented data and “continue to manipulate science to the detriment of consumers by failing to reveal all relevant findings and by selectively withholding important public health information from the public.” The group cites the work of Henry Lai and Narendra Singh of the University of Washington, Seattle, who have found DNA strand breaks in the brain cells of rats exposed to low-intensity electromagnetic radiation at 2450 megahertz, like that emitted by a mobile phone's antenna. However, the legal complaint fails to note that another group led by Robert S. Malyapa of Washington University in St. Louis, Missouri, has tried and failed to replicate that experiment.

    For the Newman case, lawyers at Angelos's firm plan to call on Lai and seven other scientific experts. The group includes researchers who have appeared in previous scientific and legal disputes over health effects linked to power lines, microwaves, and cell phones. For example, Andrew Marino, a professor at Louisiana State University Health Sciences Center in Shreveport, who holds a Ph.D. in biophysics and a law degree, testified against the siting of power lines in New York state in the 1970s, arguing that they posed a human health hazard. And climatologist Neil Cherry of Lincoln University in Canterbury, New Zealand, has reanalyzed data from negative human studies of radar technicians to claim that increased cancer rates can be detected.

    Given the negative epidemiologic findings, the debate will likely focus on laboratory studies of cell cultures and animals, such as the Lai and Singh work. The Angelos team also claims that research during the 1960s established that radio-frequency radiation—the portion of the electromagnetic spectrum that includes radiation from microwave ovens, radar, cell phones, and radio and television broadcasts—was “capable of producing biological injury.” Physicist Allan Frey, a consultant to the Angelos team, helped pioneer this area of research through his studies of the effects of microwaves on the blood-brain barrier in animals. Because electromagnetic radiation is employed by many biological processes, he says, “it would be unbelievable to think that there would not be some effects. It is so fundamental to biology.”

    But other scientists are skeptical. “No physicist will say anything is impossible. But this is somewhere between implausible and impossible,” says John Moulder, a professor of radiation oncology at the Medical College of Wisconsin in Milwaukee. Moulder has also been a consultant to parties in the ongoing lawsuits, although he would not say to whom. David Savitz, a professor and chair of epidemiology at the University of North Carolina School of Public Health in Chapel Hill, points out that earlier epidemiologic studies of radar workers have failed to find any human health effects from radio-frequency radiation. He says cautiously, “There is empirical evidence within certain bounds that there is not an adverse effect.”


    Epidemiologist Kenneth Rothman's study of deaths among cell phone users was halted by a class action lawsuit.

    Even if a convincing case can be made that cell phones leave a biological imprint, some of Angelos's own experts admit that the link to brain cancer remains speculative. Epidemiologist Eli Richter of the Hadassah-Hebrew University Medical School in Jerusalem, who will testify for Newman, says of the animal studies, “None of these things prove causation. They just look at mechanisms of possible biological effects. It will be the epidemiology that will determine causation.” Even Lai admits that the human health implications of short-term animal studies are uncertain. “The question is: Will cumulative exposures cause effects? We don't have an answer to that; there are not enough studies.”

    Class actions

    Perhaps acknowledging the difficulty of proving causation, a new wave of class action lawsuits filed this year, unlike the Newman case, does not claim that anyone actually developed cancer from a cell phone. Instead, the lawsuits allege a pattern of “fraudulent and conspiratorial conduct” and “deceitful and misleading statements.” In particular, they claim that the cell phone industry failed to adequately test its products before putting them on the market and failed to warn consumers about possible health risks. These lawsuits demand that the industry provide all users with earpieces so that they can talk without holding the phones next to their heads.

    However, plaintiffs in the class action lawsuits will still face the challenge of proving that cell phones pose a hazard. Most of these cases, including the Newman case, were initially filed in state courts. Because some states have not yet adopted Daubert, plaintiff lawyers may be hoping to benefit from a less rigorous review of their scientific claims. Many of the early skirmishes have been over where the cases will be heard, and some have already been moved to federal courts, where the Daubert standard is more likely to come into play.

    Next January, if all goes according to schedule, lawyers in the Newman case will debate the science in a pretrial hearing before a Baltimore federal judge. If she finds that Newman's arguments pass the Daubert test, the case may get on track for a full-scale trial. If not, the case could effectively come to an end without ever reaching a jury. The hearing will be a critical challenge to Newman's claims, but more importantly, it will test the courts' new standard for vetting science.


    Drug Magnate Applies Strong Therapy at Imperial

    1. John Pickrell*
    1. John Pickrell writes from Hertfordshire, U.K.

    The former Glaxo Wellcome chief hopes his experience of the corporate battleground will help rejuvenate a top research institution

    LONDON—It can be frustrating being the also-ran, always in someone else's shadow. Imperial College knows exactly how it feels. Ask anyone on the street to name Britain's top research universities, and Oxford and Cambridge will invariably get a mention. Ask about number three, and you may get blank looks. But now, after 100 years as runner-up, Imperial may be about to offer a strong challenge to the leaders.

    Earlier this year the college, situated in a leafy and expensive west London district among sprawling gothic museums and avenues of smart Victorian residences, broke with tradition and appointed a nonacademic as its new rector. At the helm now sits Richard Sykes, until recently chief executive of Glaxo Wellcome and chief architect of a merger that formed GlaxoSmithKline, the world's largest pharmaceutical company. Sykes is now hard at work reinventing Imperial. He's not interested in joining the Oxbridge club. Instead, his sights are set farther afield. “I don't want the U.K. as our benchmark,” he says, preferring to model the college on the top U.S. research universities—the likes of the Massachusetts Institute of Technology, the California Institute of Technology, and Stanford University—with stronger links to industry and a sizable endowment to allow financial independence.

    Views differ on whether he can achieve this goal, but there's little disagreement that Imperial needs to hone its competitive edge. Over the past decade, British higher education has undergone dramatic changes. Nine years ago, 29 polytechnic colleges were elevated to university status, increasing the total by more than a third. Student numbers have also swelled, with more than 35% of school leavers entering university today compared to fewer than 10% 3 decades ago. There are now many universities with research reputations eager to usurp Imperial's number three spot.

    The college itself has also changed enormously over the past decade. Imperial, which was founded in 1907 and traditionally specialized in science and engineering, has now acquired a large medical school and an agricultural college. The number of students—9900 undergraduates and graduates in the 1999–2000 academic year—is up 30% in the past 5 years. “We have all the disciplines that can be brought to bear on the big issues, [and] we attract some of the very best people,” says Sykes. “The challenge will be to ensure that we realize all this potential.”

    Sykes was an inspired choice for a university keen to strengthen ties with industry. He became head of Glaxo in 1993 and guided mergers with Wellcome in 1997 and SmithKline Beecham last year; he retains a figurehead position as nonexecutive chair of GlaxoSmithKline. Before his managerial career, Sykes was a researcher, earning a doctorate from Bristol University in 1973 and then working for many years as a microbiologist, both at Glaxo and at the Squibb Institute for Medical Research in Princeton, New Jersey. He became a fellow of the Royal Society, Britain's premier scientific club, in 1997.

    No time for reflection.

    Richard Sykes wants to model Imperial College on the likes of the Massachusetts Institute of Technology.


    When Sykes arrived at Imperial in January, he found about 35 academic departments all reporting directly to him, a model he deemed totally unmanageable. Following much expansion in the past 10 years, the existing structure was “disorganized and compartmentalized,” agrees John Pendry, principal of the new faculty of physical sciences. Sykes quickly reorganized the jumble into four faculties—engineering, life sciences, medicine, and physical sciences—which will be established officially next August. Part of the rationale, he says, is to encourage cross-disciplinary research: “With four faculties, the focus is clear and each can drive the mechanisms of bringing [disciplines] together.”

    Researchers have welcomed the reforms, even if they are a bit breathless at the pace of change. “Sir Richard doesn't muck about,” says Andy Purvis, an evolutionary biologist. “It seems a good strategy, which inertia wouldn't have been. … The old setup was showing its age, both because recent college expansion has outgrown it and because research trends have brought some previously distinct fields together.”

    With internal restructuring under way, Sykes is turning his attention to money. Although Imperial has a large research budget—$174 million from grants and contracts in 1999–2000, again second only to Oxford and Cambridge—funding is hand to mouth and reserves are small. As at virtually all British universities, running costs are covered by government grants and the minimal tuition fees paid by students, whereas research is supported by the government-funded research councils, foundations, and industrial contracts. Sykes, however, wants Imperial to have more financial freedom.

    Greater independence will be vital, Sykes says, to attract the best researchers. “We want to be able to offer increasingly decent packages to encourage the best,” says Sykes, “to offer them something in the same way as the American universities can, because they have large endowments.” A healthy endowment would also allow a more flexible research strategy. “We need access to money so that we can do things quickly,” such as refocusing research on a topical problem, says Sykes.

    But where will this nest egg come from? Imperial can't hike tuition fees, which are set by the government, so Sykes plans to copy an American tradition: tap alumni for cash. “We haven't gone after the money in a professional way like the Americans do,” says Sykes, who has now set up an office to raise funds from past students. One former student, technology investor Gary Tanaka, has recently set the ball rolling by donating $36 million to create a new business school and upgrade dated 1960s entrance buildings. “Just think of the wealth generated by the people who have come out of here in the last 100 years,” says Sykes.

    Sykes also hopes to increase the financial returns from Imperial's intellectual output. Imperial is no stranger to industry: It has generated 57 spin-off companies and begets new ones at a rate of about two a month. “These will have a significant role in moving the college forward,” says Sykes, who boasts that some are quoted on the stock exchange and that the college has already reaped wealth from investments in these companies. Others are less enthusiastic. “Spin-offs will never bring in large proportions of funding. Even the most successful institutes in this field only generate a few percent of their income from them,” says Peter Cotgreave of the pressure group Save British Science.

    Sykes's keen commercial instinct is also persuading him to develop the Imperial brand: He has assigned several senior members of his administration the task of raising the college's profile and selling Imperial abroad. “In the past we have been arrogant enough to believe marketing wasn't necessary,” says Chris Towler, director of strategy development, but this is no longer viable in today's competitive world. Sykes admits, however, that developing a brand to match Oxford or Cambridge will be a challenge. “We're up against 700 years of history,” he says. But he's determined to give it a shot.


    A True-Blue Vision for the Danube

    1. Karen F. Schmidt*
    1. Karen F. Schmidt is a freelance writer in Bucharest.

    Romanian scientists are at the forefront of a European effort to balance the protection and exploitation of vast, diverse wetlands

    BUCHAREST—In 1983, dictator Nicolae Ceausescu decreed that the Romanian Danube delta, one of Europe's largest wetlands, be diked for growing rice and maize. The edict came despite evidence that the soil was too salty for agriculture and after industrial-scale reed production in the 1950s and fish farming in the 1970s had produced disastrous results. “Every time the scientists had the opposite opinion, but these were political decisions,” says Basarab Driga of the Romanian Academy's Institute of Geography in Bucharest. Nearly 15% of the delta had been transformed into marginal cropland by December 1989, when both Ceausescu and his grand plans for the Danube were laid to rest—just in the nick of time, say many Romanian scientists.

    Fast forward to 30 April 2001, when, ironically, Ceausescu's extravagant House of the People here in the Romanian capital hosted a major conference on the Danube region. In a speech, Romanian President Ion Iliescu, a reformed Communist, acknowledged that past economic development along the Danube had caused “unacceptable material and human costs.” He vowed to cooperate with 13 other European countries on an ambitious effort to restore the Danube—particularly its unique delta wetlands—while economically energizing the mainly impoverished region.

    In their efforts to undo the ecological harm of the past, Romania and other countries are trying to implement the trendy, complex notion of sustainable development. Although this term means different things to different people, in Romania, at least, scientists are poised to play an important role in studying the Danube delta's pollution and wildlife and advising the government on policies to remedy the watershed's problems. “Scientists are very important, because they are the ones who can imagine new processes, who can try to make activities more friendly to nature but at the same time economical and efficient,” says George Romanca, an ecologist at the National Center for Sustainable Development, funded by the United Nations Development Program. “It's our duty to realize projects that will lead to long-term development.”

    How well they will succeed, however, is an open question—particularly in the Danube delta, where one environmental problem is often traded for another, and public understanding and support for conservation is weak. Still, Angheluta Vadineanu, head of Bucharest University's Department of Systems Ecology and Sustainable Development, is cautiously optimistic: “I don't know how much the politicians understand about sustainable development, but it's there in the documents. That means there's a new model to follow for economic development, and that's a very important step.”

    The delta shows vividly how this post-Communist experiment could play out. The entire watershed, says ecologist Kate Lajtha of Oregon State University in Corvallis, “is a great natural lab.” It's a prime place to answer ecological questions underpinning sustainable development: how far pollution travels and how it affects fish, for example, and how farming practices can affect water quality all the way down the Danube River.

    A new vision.

    Romania's Cold War dictator, Nicolae Ceausescu, hoped to transform the Danube delta (box) into cropland. Today, scientists are helping to forge a sustainable development plan for the delta's rich but fragile resources. (The Danube River basin is shown in green.)


    Adapt or die

    As the Danube River bleeds into the Black Sea, its sediments fan out over 4200 square kilometers—3500 in Romania and 700 in Ukraine—and form a mosaic of 32 types of ecosystems. These include shallow floodplain lakes speckled with white water lilies, sand dunes with liana-covered Balkan oak-ash forests, and what is believed to be the world's largest stand of reedbeds.

    This diversity creates a haven for creatures rarely seen elsewhere in Europe; its most famous denizens are the 320 species of birds, including white and Dalmatian pelicans. In winter, more than half of the world's red-breasted geese stop there, and in spring and summer, pygmy cormorants gather in large, raucous colonies.

    Other wildlife leads a stealthy existence. While creating the first Red List of delta species last year, “we discovered 37 species new to science!” exclaims Mircea Staras, scientific director of the Danube Delta Research Institute (DDRI) in Tulcea. Among the finds are many wetland insects and an endangered fish, Knipowitschia, that lives in fresh and brackish water and is a key link in the food chain of pike perch.

    Fishers have been attracted to the delta since Neolithic times; fortress walls and mosques of past settlers are common sights. The biggest human impacts, however, came in the 20th century—and not just because of Ceausescu. Shipping canals built in the late 1800s to ease transport to the Black Sea were widened and branches were added, which changed the water flow from the Danube and allowed pollutants to infiltrate deeper into the wetlands, says Nicolae Panin, director-general of Romania's National Institute of Marine Geology and Geo-Ecology. In addition, the Iron Gates dams built upstream in the 1970s and 1980s cut sediment supply by half, causing catastrophic beach erosion of up to 20 meters per year in some areas and allowing more saltwater from the Black Sea to move into the delta's freshwater lakes and ponds.

    These changes eroded the wetlands' ability to sequester and detoxify pollutants, which were accumulating due to increased industrial effluents, pesticide and fertilizer runoff, and sewage from 80 million people in the Danube basin. By the late 1970s, algal blooms and oxygen starvation linked to nutrient overenrichment—eutrophication—began appearing with alarming frequency.

    Around then fish populations also began to change, says Staras. Carp and other species requiring access to upstream spawning areas declined in numbers, as did those that thrive in clear water, such as pike. Species able to adapt to turbid conditions and algal blooms, such as bream, became more common. And since beginning their decline 3 decades ago, the Danube's beloved sturgeons have dwindled from six species to two, Staras says.

    While some wildlife has suffered, overall the delta has withstood the human onslaught surprisingly well, according to Panin. As polluted as it is, compared to other European wetlands the Danube delta “is still one of the cleanest and most natural deltas in Europe,” Panin says. That's why in 1990, almost immediately after the Communist regime crumbled, a large portion of the Romanian Danube delta was declared a biosphere reserve, joining a United Nations network of sites dedicated to sustainable development. Approximately half of the 5800-square-kilometer reserve was designated an economic zone where some fishing, farming, forestry, and habitation is allowed. About 9% of the reserve became strictly protected areas—because of their rare and sensitive species—that are off limits to all but researchers with permits. The remaining 40% is buffer zone with limited human activity.

    Voyage of discovery.

    Scientists with the Danube Delta Research Institute, here on a monitoring day trip, found 37 new species in the delta in recent surveys.


    These designations were a leap into the unknown: the start of an experiment to balance human uses while protecting natural resources. The reserve boasts the first management plan, the first public awareness strategy, the first ecological information center—“the first for us in everything,” says Grigore Baboianu, executive director of the government's Danube Delta Biosphere Reserve Authority.

    Today, officials and scientists such as Baboianu face the challenge of making their experiment succeed against a backdrop of continuing economic and political change. So far, many of the variables have worked in their favor. In 1995, for instance, Oregon's Lajtha found that when compared with other wetlands worldwide, the Danube delta did not appear to be heavily polluted with metals—a benefit, perhaps, of many Communist-era industries' having closed. Fertilizer and pesticide runoff is also less of a threat, because Romanian farmers can't afford chemicals and are abandoning cropland. “The Danube delta was full of nutrients and suffering from eutrophication, but it's getting better now,” says Baboianu. These days, algal blooms occur mostly in the summer.

    But many people fear that those trends could reverse themselves as economic development kicks in again. Already some of the delta's fish, for example, are more beleaguered than ever. Under Communism, fishing was a state-run industry that used only low-tech nets and other equipment. In the past decade, the free market has ushered in high-tech gear, while weak laws and poorly paid civil servants fail to limit catches. “It's a disaster,” says Staras. “Nobody knows how many fishermen there are.” In April, Romania passed a law that requires licenses and closed seasons and establishes a new agency to regulate fishing, but Staras is dubious. “Because the law came so late,” he says, “it will be very difficult to correct the situation.”

    Turning buzzwords into action

    Similar problems may soon arise on land, as Romania attempts to reinvigorate its agricultural sector—efforts that may boost fertilizer use. International conventions for protecting the Danube call for all Danubian countries to cut nutrient pollution 40% by 2010. Romania could hold the key to whether that goal is met. “We're concerned and hope that post-Communist countries take a different approach than the West,” says Jasmine Bachmann of the World Wide Fund for Nature in Vienna. Vadineanu's group recommends that Romania prevent runoff through smart landscape planning, by placing buffer zones between fields and the Danube River and by restoring 150,000 hectares of wetlands (see sidebar). He estimates that the roughly $275 million cost of restoration could be recouped within 6 years from ecological goods and services provided by the delta, including nutrient retention, flood control, and the rebuilding of fisheries.

    Healthy tension?

    One of the biggest challenges will be to limit fishing to designated parts of the delta.


    Selling these lofty ideas is not easy. Many delta inhabitants resent the fact that they are banned from entering the biosphere reserve's strictly protected areas and are allowed to catch only 3 kilograms of fish per day, for the sake of the nebulous concept of sustainable development. Romanca understands such sentiments. “After 10 years, there's been no change for the better for them,” he says. He blames the economic stagnation and concomitant resentment on heavy exploitation—of fish, mushrooms, reeds—by poor migrants and entrepreneurs looking for quick cash, and on weak law enforcement. Budget cuts have reduced the number of reserve wardens from 75 in 1995 to 40 today.

    A paltry science budget—just 0.2% of Romania's gross domestic product—has also provided little support for research at the delta. But one group, the DDRI, has managed to come out ahead. Under Communism the institute emphasized economics, but in the 1990s it changed its focus to ecology and shrewdly took advantage of the new biosphere reserve. As Staras puts it, “We trust in the national and international attraction of the Danube delta.” Indeed, the institute, located on the reserve's edge, won the lead research role in a $4.5 million World Bank biodiversity project that ended last year. The funding enabled the researchers to modernize labs and expand programs for monitoring water quality, vegetation, and wildlife populations. Well-equipped researchers, however, may not necessarily make a difference to the delta's ecology. “The money has been spent, but the fish stocks are worse off than before,” says DDRI biologist Zsolt Torok.

    Still, the DDRI and other Romanian institutes are now gearing up for more experiments to try to balance economic development with nature preservation. The European Union has provided $145,000 for a pilot project that, among other things, will evaluate habitats and species and restore alluvial forests at Braila Island, another wetland sanctuary along the Danube. The World Bank, meanwhile, has pledged $5 million to a project along the Romanian Danube that aims to make farming harmonious with conservation. A second World Bank program may plow $24 million into sustainable forestry.

    Environmental researchers in Romania should benefit from this surge in international funding. But whether they help achieve the goals of sustainable development remains to be seen. “Until now, we scientists have complained about the politicians,” says Vadineanu. “But now it's our turn to show that we can put this into practice.”


    Restoring the Vitality of Rich Wetlands

    1. Karen F. Schmidt*
    1. Karen F. Schmidt is a freelance writer in Bucharest.

    BUCHAREST—Nicolae Ceausescu made serious mistakes in the Danube delta, but at least they seem to be reversible. Several sites converted under Communism to agricultural fields, fish farms, and forestry projects are already being returned to nature. Now many more restorations are planned. As part of a World Wide Fund for Nature (WWF) project called “Green Corridor for the Danube” launched in June 2000, the governments of Romania, Bulgaria, Moldova, and Ukraine have pledged to create a network of at least 600,000 hectares of floodplain habitats along the Lower Danube River and the Prut River, and in the Danube delta. That will require ecological restoration of 200,000 hectares.

    The delta's Babina Island will serve as a model. Under a Ceausescu order, the riverine island of 2100 hectares was diked in 1985, drying out its ponds and rivulets. The salty soil was then plowed and crops were unsuccessfully cultivated. After the creation of the biosphere reserve in 1990, a team of ecologists from the WWF, the Danube Delta Biosphere Reserve Authority (DDBRA), and the Danube Delta Research Institute decided to try to undo the damage. In spring 1994 they breached the dams on Babina to reconnect the island with the river's flooding regime.

    Restoration in action.

    Since Babina Island's dams were breached in 1994, the swamp ecosystem has regained its vitality surprisingly quickly.


    Natural regeneration occurred surprisingly rapidly, according to the project leaders. By the second year, most of the aquatic and swamp plant communities had returned, as well as fish, birds, and other native creatures. “If we let nature work by itself, it's very wise,” says the DDBRA's Grigore Baboianu. “It's not necessary to have a complicated philosophy of restoration.”

    Next, the team opened the dikes at another riverine island and a polder once used for forestry. Last year, they began work at a failing fish-pond complex called Popina. “The local people were the first rehabilitators. They were making small openings to bring in fresh water,” says Erika Schneider of the WWF's Auen Institute for Floodplains Ecology in Rastatt, Germany. She and her colleagues punched more holes and are waiting to see if Popina's circulation has improved enough to bring back native fish.


    Nanoscientists Look to the Future

    1. Robert F. Service

    WAILEA, HAWAII—About 150 researchers gathered here from 28 to 30 October to discuss the latest big developments in the science of small. In between dips in the ocean and mountain hikes, participants heard about efforts to manipulate nanosized objects and improve optical data storage. And one young scientist offered hope that the future is in good hands.

    Expanding the Nano Toolbox

    If you've ever tried to remove a splinter with blunt tweezers, you know the frustration facing nanotechnologists. So far they have been better at spotting tiny objects than at moving them about. But at the Hawaii meeting they announced two important advances for those hoping to manipulate nanosized objects: a set of custom-built nanotweezers and a movable platform capable of making successive steps just 2 nanometers long.

    Together, the feats help lay the groundwork for a new era of manipulating objects in the nanoworld, says Metin Sitti, an electrical engineer at the University of California, Berkeley. “It's a very exciting time in the area of nanomanipulation. Lots of groups are making progress in this area,” Sitti says.

    One of those groups, led by Peter Bøggild of the Technical University of Denmark in Lyngby, reported a new way to custom-build nanotweezers to accomplish just about any nano-grabbing task. The tiny tweezers aren't the first to work on the nanoscale. In 1999, Harvard University chemist Charles Lieber and his colleagues made a set of nanotweezers by attaching a pair of carbon nanotubes atop separate electrodes on the end of a glass micropipette. By applying different electric voltages to the electrodes, Lieber's team could get the nanotube arms to pinch and release objects. But the technique creates a large electric field at the tweezer tips, which can alter the objects being manipulated. Moreover, the tweezers must be constructed one at a time, making the manipulation of nano-objects a slow and tedious process.

    Bøggild's group took up both these problems at once. The researchers first used standard micromachining processes to carve a small slab of silicon into what look like tiny pliers with tips jutting off the edge of the slab. Applying a voltage to three electrodes—one between the arms of the pliers and one on either side—drives the cantilever arms of the pliers together and apart.

    To turn the micropliers into nanotweezers, Bøggild's team used an electron beam from a scanning electron microscope to grow a tiny tweezer arm from the end of each of the cantilevers. The beam shattered hydrocarbon molecules present in a surrounding vacuum chamber, causing carbon deposits to build up at the focal point of the electron beam. By carefully directing the electron beam, the team angled the two tweezer arms together until the tips were some 25 nanometers apart. What's more, the initial micromachining processes can create electrode-driven plier arms by the batch. The work, Sitti says, is a “really good example of fabricating nanostructures.”

    But making nanotweezers is only half the challenge in manipulating objects in the nanoworld. The other half is moving them where you want them to go. To help accomplish this task, Takashi Shigematsu and Minoru Kuribayashi Kurosawa of the Tokyo Institute of Technology, along with Katsuhiko Asai at Matsushita Electric's Advanced Technology Research Labs in Kyoto, created a new “stepping drive,” which uses sound waves inside crystalline substrates to nudge a small platform. Kurosawa has spent years improving such drives, and the latest model reduces the step size 10-fold over its predecessor, to a mere 2 nanometers.

    Robert Shull, a physicist at the National Institute of Standards and Technology in Gaithersburg, Maryland, calls the ultraprecise stage “an enabling technology for making manipulations on the nanometer scale.” It's also a device that should work well with the nanotweezers. “All of these nanomanipulators will need to be able to move with that kind of resolution,” Shull says. “You need something that can move in a controllable fashion. This appears to have that.”

    Taking Nanowires Beyond Gold

    Mariangela Lisanti is having a very good year. Last December, she won the Siemens Westinghouse Science and Technology Competition and a $100,000 college scholarship for work that reveals how electrons dart through wires just a few atoms thick. In March, the same project earned her another $100,000 scholarship and top honors at the Intel Science Talent Search, as well as a visit to the White House to meet President George W. Bush. In May, she pulled in several more awards at Intel's International Science and Engineering Fair, including an invitation to attend the 2001 Nobel Prize awards ceremony next month in Stockholm. And last month she delivered a plenary talk here alongside other nanotechnology notables, including 1996 chemistry Nobelist Richard Smalley. Not bad for someone who just started her freshman year at Harvard. “She is very impressive,” says Phaedon Avouris, a nanotechnology pioneer at IBM's T. J. Watson Research Center in Yorktown Heights, New York.

    Lisanti says modestly that these honors were unexpected. But an outside observer might have guessed. While at Staples High School in Westport, Connecticut, Lisanti was valedictorian, captain of the math team, founder and captain of the school's engineering team, and a concertmaster of the chamber and symphonic orchestras. Fluent in Italian and Spanish, she received numerous awards in language as well as in science competitions and was named a Governor's Scholar, the highest academic distinction in Connecticut.

    Lisanti says she's dreamed of winning a science fair for years. But getting there took perseverance. Her high school didn't have much tech-heavy equipment, so she asked her teachers for names of area professors who might mentor her. After several rejections, she hooked up in 1999 with Mark Reed, the chair of the electrical engineering department at Yale and a pioneer in molecular electronics. “Actually, my first answer was no,” Reed confesses. “But after a short conversation I became convinced she could handle it.”

    In 1997, Reed's group became the first to measure the flow of electrons through a single molecule (Science, 10 October 1997, p. 252). The team created an ultrathin gold wire, broke it, and then used chemical techniques to deposit gold-binding molecules in the gap. Under such conditions, electrons no longer flow in a chaotic stream as they do in large wires. Rather, they shuttle one by one in an orderly fashion that reveals their underlying quantum-mechanical nature.

    Fast start.

    Freshman Mariangela Lisanti is racking up science awards.


    Reed's technique works beautifully for gold contacts, but it is laborious and difficult to adapt for testing different metals. “We sought to develop a new technique that could collect larger data sets and work with other metals,” Lisanti says.

    Reed challenged Lisanti to find a mechanical scheme for repeatedly making and breaking a connection between pairs of nanowires, each of which would need to remain connected to measurement electrodes. After some thought, Lisanti suggested using the tiny vibrations of a ceramic crystal—which are used in door buzzers to create sound—to drive the movements of a small platform that in turn moved the wires in and out of contact. Reed gave his approval, and after a summer of 60- to 80-hour weeks, Lisanti got the setup to work.

    “As the wires pull apart, the break is not abrupt,” Lisanti says. “They reduce to only one atom, and so you can see the quantization” of electrons flowing between the wires. Not only did the setup work with various types of metal wires, but its speed enabled Lisanti to amass millions of measurements of electrons shuttling between nanowires. The volume of experiments turned out to be vital, because each measurement shows a slightly different behavior, due to the precise position of individual atoms at the junction.
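    The staircase pattern Lisanti describes is the textbook signature of conductance quantization. The article doesn't spell out the numbers, but as a rough illustration under standard assumptions: when the junction narrows to n fully open atomic-scale channels, the conductance is expected to rise in integer multiples of the conductance quantum,

    G = n \, \frac{2e^{2}}{h}, \qquad \frac{2e^{2}}{h} \approx 77.5\ \mu\mathrm{S} \approx \frac{1}{12.9\ \mathrm{k}\Omega},

    so a contact thinned to a single gold atom should conduct at roughly one quantum, and each additional open channel adds another step of about the same height.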

    Lisanti found some surprising patterns. Most startling was the conduction of electrons between gold nanowires, when the junction between the two wires was still relatively intact and contained numerous gold atoms. Although one might expect the conductance to increase linearly as more and more atoms connect the adjacent wires, Lisanti found instead that it jumped up at regular intervals. That pattern suggested that there were certain “magic numbers” of gold atoms that conduct electrons better than other configurations. Lisanti and Reed have since hypothesized that the magic numbers are related to stable configurations in which the gold atoms are completely surrounded by their neighbors. In contrast, preliminary results suggest that copper nanowires don't show such islands of stability.

    Lisanti hopes someday to continue her experiments with Reed to see how general the phenomenon that she discovered is. But for now she's just trying to settle into her new life as a college freshman, in between trips to Hawaii and Stockholm.

    Lighting the Way Ahead for Data Storage

    Engineers hoping to pack longer movies onto DVDs are up against a formidable challenge: overcoming the physics of light. The devices use a laser to highlight tiny spots burned into the heart of a plastic disk. Unfortunately, researchers can make these spots only so small—about 400 nanometers—before the lenses and optical detectors can no longer see them. That's because diffraction causes light waves to spread out, blurring the image.
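    That spot-size floor follows from the standard diffraction (Abbe) limit. As an illustrative check—the wavelength and numerical aperture here are typical values assumed for a red DVD-class laser and lens, not figures from the meeting report—a focused spot cannot be made much smaller than

    d \approx \frac{\lambda}{2\,\mathrm{NA}} \approx \frac{650\ \mathrm{nm}}{2 \times 0.6} \approx 540\ \mathrm{nm},

    which is of the same order as the roughly 400-nanometer marks cited above.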

    Over the past decade, scientists have come up with a specialized instrument, the near-field scanning optical microscope (NSOM), that can spot features as small as 100 nanometers by placing the aperture of a detector so close to a surface that light bouncing off doesn't have room to diffract. But these microscopes collect light so poorly that it takes them too long to read the spots on a rapidly spinning disk, making them useless for data storage.

    At the meeting, a group led by Kelly Pellerin and Tineke Thio of the NEC Research Institute in Princeton, New Jersey, reported a way to help NSOMs pick up the pace. The team created a specially designed aperture that funnels surrounding light effectively enough to increase light transmission 150-fold. The NEC researchers say they are working on improvements that should produce a 1000-fold enhancement, which may be enough to make NSOMs useful for data storage. “It's a powerful concept,” says Calvin Quate, an expert on scanning microscopes at Stanford University in Palo Alto, California. If the improved version works, Quate adds, “it would be terribly important.”

    NEC's light funnel takes advantage of an electrical effect called surface plasmon resonance, which occurs when light strikes a metal surface. Under the right conditions, this light causes electrons in the metal to oscillate back and forth. These oscillating electrons can dramatically boost the electromagnetic field at the edge of a small aperture in the film, a change that in turn helps light pass through. In 1996, NEC's Thomas Ebbesen and colleagues showed that by patterning a metal film with an array of tiny holes precisely arranged to increase the electron resonance, they could pass 1000 times as much light through the film.

    In their current work, Ebbesen—now at Louis Pasteur University (ULP) in Strasbourg, France—along with ULP's Henri Lezec and Thio and her colleagues at NEC, wanted to see if they could find a similar way to get more light through a single hole, which would be easier to integrate with an NSOM. To do so, the researchers patterned a silver film into something like a miniature bull's-eye: a 400-nanometer hole surrounded by successive rings of corrugations. When hit with light at a wavelength of about 800 nanometers, the corrugations created a sharp electron resonance and boosted the light passing through the hole 150-fold compared with an aperture without the surrounding rings. That's still not enough to allow an NSOM to spot tiny dots on a spinning DVD, says Thio. But she adds that the team is already working on an improved version by following a seeming paradox: Making the aperture smaller should increase the amount of light that gets through, because it will enhance the electric field around the hole.

    The NEC group has yet to try creating such a structure on the tip of an NSOM. But that's another upcoming project, Pellerin says. If it works, the good news is that it may lead the way to DVDs that can store 50 times more data than the current variety. The bad news is that it means we'll all have to buy new DVD players to use them.
