News this Week

Science  10 Dec 1999:
Vol. 286, Issue 5447, pp. 2050
  1. BIOMEDICINE

    NIH Sets Rules for Funding Embryonic Stem Cell Research

    1. Gretchen Vogel

    The National Institutes of Health (NIH) moved closer to funding work using embryonic and fetal stem cells last week, issuing proposed guidelines* that would allow publicly funded scientists to use these controversial materials. The carefully worded plan would rely on private labs to produce cell lines from embryos. But it would allow grantees to use the cells as long as they were derived according to strict requirements. The new rules are slightly more permissive for fetal stem cells, allowing publicly funded scientists both to derive and use them.

    Despite NIH's cautious approach, the guidelines are already under attack: Antiabortion activists have said they will try to get Congress to intervene next year and block support for all embryonic stem cell research. And even if that move fails, it could still be months before stem cell lines that meet NIH's requirements are available for widespread use.

    Many scientists believe that embryonic stem cells, which can become almost any cell in the body, hold great promise for basic research in developmental biology as well as for treating a range of diseases. But the cells have been controversial because researchers derive them from early embryos or fetal tissue. Federal law allows NIH to fund some research on fetal tissue but prohibits work that might harm a human embryo. In January, NIH's parent agency, the Department of Health and Human Services, ruled that the law did not forbid research on stem cell lines, because they cannot develop into a person once they are removed from the embryo (Science, 22 January, p. 465).

    In accord with that ruling, NIH published guidelines in the Federal Register on 2 December that spell out detailed criteria cell lines would have to meet before NIH-funded researchers could use them. For example, embryonic cell lines could come only from frozen “excess” embryos created during fertility treatments at private clinics; embryo donors would have to sign strictly worded consent forms; and the process of obtaining embryos for research would have to be separate from clinical procedures. Slightly different terms apply to fetal tissue: Federally funded scientists would be free to derive and work on such cell lines as long as they followed ethical standards similar to those already in place for other fetal tissue research. To monitor the field, the guidelines also would establish a Human Pluripotent Stem Cell Review Group, which would evaluate any newly derived cell lines to ensure they are in compliance with NIH rules. The guidelines are “a very thoughtful and very thorough response” to the political situation, says stem cell researcher Roger Pedersen of the University of California, San Francisco. And although he objects to a few of the requirements as unduly specific, “overall, it's a positive thing,” he says.

    But that specificity means existing cell lines probably won't meet NIH requirements. Indeed, the two scientists who derived cell lines described in papers last November (Science, 6 November 1998, p. 1014) agree that the consent process they used differed slightly from the NIH requirements and would be unlikely to pass muster. John Gearhart of The Johns Hopkins University School of Medicine, who derived his cell line from fetal germ cells, notes that his team's consent form does not include the required statement that the cells could be used “for many years.” The review panel might not accept the document as equivalent. The new guidelines also require that all identifiers associated with the embryos be removed before the cell lines are derived. Although the donors at Hopkins were anonymous, Gearhart says, he retained certain identifiers because the Food and Drug Administration would require information on the source if the cells were ever used to treat patients.

    James Thomson of the University of Wisconsin, Madison, who derived his cell lines from embryos, agrees that he “would probably have to derive new lines” to meet the guidelines, and he told Science that he intends to do so. But that would delay research. Thomson estimates it would take 6 months to derive, test, and grow enough cells to distribute to outside researchers.

    Both Thomson and Gearhart were funded in part by Geron Corp. of Menlo Park, California, but the University of Wisconsin's Wisconsin Alumni Research Foundation (WARF) owns the patent on Thomson's work and is “completely free to distribute cells to academic researchers,” Thomson says—“and we plan to do so.” Geron holds an exclusive license for certain potentially profitable uses of the cells, says Carl Gulbrandsen, patents and licensing director for WARF, but the university retains the right to distribute the cells for research. The university is setting up an independent, not-for-profit institute to handle requests. Gearhart has said he will decide whether to distribute his cells to other researchers once final guidelines are in place.

    The NIH's proposal could still run into trouble in Congress, however. Many observers expected the controversy to explode this fall. Representative Jay Dickey (R-AR) had proposed to amend NIH's appropriation bill specifically to bar work on embryo-derived cells. At the same time, a Senate advocate of stem cell research, Arlen Specter (R-PA), had proposed language that would be more permissive than NIH, permitting researchers to derive cell lines from human embryos as well as use them. Congressional leaders prevailed on both legislators to withdraw their proposals, and the appropriations bill signed into law last week is silent on the issue. As part of the compromise, Senate Majority Leader Trent Lott (R-MS) has promised a debate on Specter's bill in February. Both sides are gearing up.

    In the meantime, NIH is accepting public comment on the draft guidelines through the end of January. Once the final version is in place, NIH will begin accepting proposals for work on both embryonic and fetal stem cells, says Lana Skirboll, the NIH's associate director for science policy. Barring congressional intervention, she predicts the first grants could be funded as soon as next spring.

  2. PLANETARY SCIENCE

    Yet Another Loss to the Martian Gremlin

    1. Richard A. Kerr

    Failure at Mars is becoming drearily familiar: Since 1960, the United States, the Soviet Union, and Russia have launched 29 missions toward Mars, only eight of which could be called real successes. The dogged Russians and Soviets are zero for 16, engendering talk of a martian gremlin that lies in wait for unsuspecting spacecraft. Until this year, however, the Americans seemed to have dodged the gremlin, with an impressive eight successes out of 11 attempts. But in September confusion over English and metric units doomed Mars Climate Orbiter. And over the past weekend the Mars Polar Lander (MPL) went missing, giving the United States just two successes in the last five tries.

    While scientists mourned the loss of a chance to study martian water ice up close, mission planners at NASA and the Jet Propulsion Laboratory (JPL) in Pasadena, California, were at least as discouraged by the prospect that they may never know what sealed MPL's fate. In the wake of the loss, they offered two scenarios. MPL could have reached the surface intact but landed on a slope so steep that it tipped over. Or, just after it broke off radio communication with Earth, as intended, 12 minutes before its planned landing, it could have suffered some onboard problem—perhaps due to some undiscovered flaw in current designs that might turn upcoming Mars missions into gremlin bait as well.

    A complex sequence of mechanical operations was scheduled after the communications break. Outside Mars's atmosphere, MPL should have separated from a structure that had supported it during the cruise to Mars. It should have made a fiery entry behind its heat shield, deployed its parachute, jettisoned the heat shield, radar-locked onto the surface, and separated from the parachute. Then, using rockets to decelerate, it should have made a gentle touchdown on the surface.

    With no word from the lander, engineers don't know how any of that went. MPL had no way of communicating during its entry, descent, and landing, unlike its predecessor Mars Pathfinder, which landed successfully in 1997. Pathfinder's mission, notes project scientist Matthew Golombek of JPL, was to test a new airbag-cushioned landing method, so mission engineers documented spacecraft performance to the very end. In the case of MPL, spacecraft designers economized by leaving out the somewhat complex and expensive equipment needed for continuous communications. After all, two Viking spacecraft had successfully made rocket-braked landings on Mars in 1976, using much older technology.

    Even if MPL did make it down, other hazards awaited it. Like all other landings on Mars, it would have touched down on little-known terrain. According to Richard Zurek, MPL project scientist at JPL, images of the landing zone made by the orbiting Mars Global Surveyor showed a relatively smooth surface. But because a picture element in most of those images encompasses 4 meters, plenty of lethal hazards—car-sized potholes or meter-high ledges, for example—could be hiding in the apparently innocuous terrain. “We can't capture [lander-scale hazards] with the images we have,” says Zurek. “That's a risk you take when you go to Mars. We're going to places on Mars we haven't been before. You can't guarantee success.”

  3. TISSUE ENGINEERING

    Growing Human Corneas in the Lab

    1. Dan Ferber*
    1. Dan Ferber is a writer in Urbana, Illinois.

    If the eyes are the windows to the soul, the cornea is the windowpane, a tough but transparent layer of tissue that lets light through but protects the interior of the eye from the elements. But although a smudged window is easy to clean, a cornea clouded by injury or disease can impair vision and lead to blindness. Surgeons can often replace damaged corneas with healthy ones from organ donors. But the supply barely meets the demand, and few corneas remain for researchers, who need them to study corneal wound healing and eye diseases. That leaves none for toxicity testing of drugs and household products. As a result, manufacturers often test them on animals, usually rabbits. Now, researchers have taken a big step toward alleviating the cornea shortage.

    On page 2169, cell biologist May Griffith of the University of Ottawa Eye Institute, Mitchell Watsky of the University of Tennessee College of Medicine in Memphis, and their colleagues report that they have used lines of cultured human cornea cells to fashion the first working equivalent of a human cornea. Although the engineered cornea is far from ready for use in human transplants, it should come in handy for research, experts say. And for toxicologists, the artificial tissue is a sight for sore eyes, potentially offering a new way to conduct safety tests and thus reduce the use of animals. “I think it's beautifully done work,” says ophthalmologist and surgeon Stephen Foster of the Massachusetts Eye and Ear Infirmary in Boston.

    The researchers originally made the corneas because they wanted to understand what sometimes causes corneas to fail to heal properly after laser eye surgery. But the few human corneas they could get from the eye bank were often old or diseased. “The eye-bank corneas didn't work out, so we decided to build our own,” Griffith says.

    Corneas contain three types of cells: the epithelial cells that form the outer layer; the keratocytes that populate the middle, or stromal, layer; and the endothelial cells of the inner layer. The researchers first created their own stock of all three cell types by inserting viral genes that enabled the cells to grow indefinitely in the lab. Then, before assembling them into corneal tissue, they tested the cells to make sure they retained the traits of healthy corneal cells and did not have any telltale signs of cancer, such as the ability to grow in a Jell-O-like medium called soft agar.

    To construct the artificial corneas, the researchers first grew a thin layer of endothelial cells in a culture dish, covered that with a mixture of keratocytes and support proteins, and then layered epithelial cells on top. After letting the cornea mature for 2 weeks with its epithelial cells sitting at an air-culture media interface, just as in a real cornea moistened by tears, the researchers found they had a transparent tissue that behaved much like a human cornea. For example, a mild detergent activated the same set of wound-healing genes as it does in human corneas.

    The engineered corneas also clouded up in response to irritating detergents to about the same extent as human and rabbit corneas, says study co-author Rosemarie Osborne, a toxicologist at Procter & Gamble in Cincinnati, Ohio. The company, which has been dogged by animal-rights activists, co-sponsored the study as part of a long-term effort to devise new ways to test the toxicity of its consumer products without having to use animals. Here they hoped to find a reliable replacement for the Draize test, which measures how irritated the corneas of rabbits become following chemical exposure.

    The engineered corneas are “a significant advance” along that road, says Alan Goldberg of the Center for Alternatives to Animal Testing at The Johns Hopkins University School of Public Health. But he adds, “It's not yet an alternative to the Draize test.” Researchers must first learn to mass-produce the corneas, and they must confirm that the engineered corneas respond the same way a human eye does. As a step in that direction, the Procter & Gamble group has teamed up with another household-products giant, Unilever, and a nonprofit group called the Institute for In Vitro Sciences in Gaithersburg, Maryland, to examine, among other things, how the engineered corneas respond to known ocular toxins.

    For transplant surgeons, the corneas are clearly not ready for prime time. Even though the cells the researchers used did not have any of the telltale signs of cancer, they might still become cancerous later on. Before using them for human transplants, researchers would have to dispel that concern, among others, including that the engineered corneas don't provoke an immune response and that they remain transparent over the long haul. The corneas have “some potential” for use as transplants, Foster says, but “I think it's a long way off.”

    In the meantime, the engineered tissue could prove a boon to laboratory research on the cornea. “The work is exciting,” says corneal surgeon Terrence O'Brien of The Wilmer Eye Institute at Johns Hopkins School of Medicine. “It could be used as a model system to … answer a lot of fundamental questions.”

  4. PLANETARY SCIENCE

    Shaking Up a Nursery of Giant Planets

    1. Richard A. Kerr

    What are Uranus and Neptune doing so far from the sun? The question has puzzled theorists for decades. Unlike the closest six planets, which orbit the sun inside of 10 astronomical units (AU)—that is, less than 10 times farther out than Earth—Uranus and Neptune orbit at 19 and 30 AU, respectively. Theorists don't believe they could have formed so far out; there, gas and dust were too sparse to coalesce into planets. Now, a new computer model suggests that sibling rivalry might be to blame for their banishment. Runty Uranus and Neptune may have grown up in tight quarters much closer to the sun, only to have the big bruisers Jupiter and Saturn fling them into the outer reaches of the solar system.

    “It's a fascinating result,” says planetary dynamicist Brett Gladman of the Observatory of Nice in France. “I think it's marvelous it works.” Forming Neptune and Uranus where it's practicable and then having them thrown outward “seems dynamically plausible,” adds Stuart Weidenschilling of the Planetary Science Institute in Tucson, Arizona. “It certainly comes out ahead of any previous explanations.”

    The nine planets formed in a nebular disk of dust and gas, where chunks of primordial matter collided to form bigger and bigger chunks and eventually planets. For the next several million years, the resulting ice-rock cores of the outer planets grabbed gas from the nebula until the gas was all taken up or blown away by the sun. But out on the nebula's fringes, matter was spread too thinly for anything like planets to form. In the best simulations of the process, cores for Uranus and Neptune fail to form at their present positions in even 4.5 billion years, the lifetime of the solar system. “Things just grow too slowly” in the outermost solar system, says Weidenschilling. “We've tried to form Uranus and Neptune at their present locations and failed miserably.”

    Apparently, Uranus and Neptune formed somewhere else, presumably closer to the sun where the nebula was far denser. But how did they move outward billions of kilometers without disrupting their nicely circular orbits, which are in the same plane as the rest of the solar system? In this week's issue of Nature, planetary dynamicists Edward Thommes and Martin Duncan of Queen's University in Kingston, Ontario, and Harold Levison of the Boulder, Colorado, branch of the Southwest Research Institute demonstrate a two-step method that works, at least in their model. They assume that not just two but four or five ice-rock cores formed where Jupiter and Saturn now reside, between 5 and 10 AU—an assumption most theorists are comfortable with.

    Conventional thinking also has it that once a core reached a critical mass—about 15 times that of Earth—it would grow faster and faster as its increasing mass gave it greater and greater gravitational pull on the gas. By chance, in Thommes and Duncan's scenarios, Jupiter hit runaway growth first, letting it grab 71% of the total mass of the outer planets; Saturn came in second with 21%, but late bloomers Uranus and Neptune got only about 3% and 4% of the mass, respectively, leaving them at the mercy of nearby Jupiter and Saturn. In the first 100,000 years of many of the simulations, Jupiter and Saturn gravitationally fling their nursery mates into steeply tilted, highly elongated orbits that can carry them 30 or 40 AU outward.
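
    A quick check with the present-day masses of the four giant planets (standard values assumed below, not figures from the article) shows that those percentages simply match the planets as we find them today:

        # Rough check of the quoted mass fractions, using assumed present-day
        # giant-planet masses in Earth masses (standard values, not from the article).
        masses = {"Jupiter": 318.0, "Saturn": 95.2, "Uranus": 14.5, "Neptune": 17.1}
        total = sum(masses.values())
        for planet, m in masses.items():
            print(f"{planet}: {100 * m / total:.0f}% of the outer planets' combined mass")
        # Prints roughly 71%, 21%, 3%, and 4%, matching the shares in the scenario.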

    That's a dangerous situation for an undersized giant because the big guys could, and in some of the simulations do, eject a planet from the solar system entirely. But the debris remaining in the disk, which extends to 40 AU and beyond, can step in to defend the bullied planets. In a process called dynamical friction, innumerable gravitational interactions with bits of disk debris push Uranus and Neptune around as they pass through and over the disk. The dynamical friction eases the wildly orbiting planets once again into circular orbits in the plane of the other planets but beyond the disruptive influence of Jupiter and Saturn. Further planet-disk interactions can move the relocated planets even farther out. Of the 24 simulations Thommes and his colleagues have run, about half produced an outer solar system resembling ours. Most runs also leave a disk of debris beyond 40 AU that closely resembles the Kuiper Belt of icy objects discovered in 1992, beyond the mean orbital distance of Pluto.

    “It's a very interesting idea,” says Jack J. Lissauer of NASA's Ames Research Center in Mountain View, California. “They've shown it's not as unlikely as I would have thought.” But he and others question the realism of the model and note that no one is sure what the earliest solar system was actually like. Lissauer wonders, for example, if the modelers haven't put a bit too much debris in the outer solar system. Renu Malhotra of the Lunar and Planetary Institute in Houston believes cores would have interacted with gas and ice-rock bodies in the giant planet nursery where they formed. Such interactions, which don't appear in the model, would “tend to damp this violent physics” of throwing Uranus and Neptune out. “Whether the modeling reflects what happens in nature is not demonstrated,” she says.

    More realism will require more complicated modeling and more computing power, says Duncan. In the meantime, astronomers are pushing their telescopes to the limits to catch a glimpse of more-distant Kuiper Belt objects. In many of Thommes's simulations, Neptune scatters these objects high and low before settling down itself. Like debris from a brawl, scattered Kuiper Belt objects would be a strong sign that the giant planets had an early falling out.

  5. ANTHROPOLOGY

    Proto-Polynesians Quickly Settled Pacific

    1. Bernice Wuethrich*
    1. Bernice Wuethrich writes from Washington, D.C.

    The ancestors of the Polynesian people settled the islands of the East Pacific only a few thousand years ago, but anthropologists have long debated where they came from. Even after detailed scrutiny of bones, blood, DNA, pottery, and languages, researchers have remained divided. One camp, chiefly geneticists, argues that seafaring proto-Polynesians originated in Southeast Asia and quickly island-hopped eastward, sweeping through Melanesia in the West Pacific along the way. The other, chiefly archaeologists, argues that Polynesian ancestors originated in Melanesia itself, a hotbed of human diversity with a 45,000-year record of habitation. Now geneticists have combined DNA and linguistic data from Melanesia to make a new case that the archipelago was only a way station.

    The study, in last month's American Journal of Physical Anthropology, “adds important new primary genetic data” and is “strongly consistent with” a rapid intrusion, sometimes called the “express train” model, says Patrick Kirch, an archaeologist at the University of California, Berkeley. “This is important work,” adds geneticist Rebecca Cann of the University of Hawaii, Manoa. “We have the history of evolution in our bodies. If we decipher that, we have an independent and useful record of human history.”

    Still, not everyone is convinced the patterns are so simple. Although he praises the accumulation of data, archaeologist John Terrell of The Field Museum in Chicago says he's mystified “that people try to reduce 45,000 years of prehistory down to a story as simple as two peoples [Melanesians and Southeast Asians] and two migrations. It's a lot more interesting, complicated, and significant than that.”

    D. Andrew Merriwether of the University of Michigan, Ann Arbor, chose to examine Melanesian DNA and language more closely because, he says, “the islands are a critical crossroads to explain where everyone in the Pacific passed through or came from.” The languages spoken reflect that complexity: Some Melanesians speak Austronesian languages, as do all Polynesians, but other Melanesians speak non-Austronesian languages, dozens of which can be heard on the island of Bougainville alone.

    Merriwether and Jonathan Friedlaender of Temple University in Philadelphia, Pennsylvania, opened up lab freezers in the United States and Melanesia and resurrected blood samples taken from Melanesian islanders in the 1960s and ’70s. They extracted mitochondrial DNA (mtDNA) from the blood and searched its nucleotide bases for a genetic signature shared by all Polynesian and many Southeast Asian people—a nine-base pair deletion that was presumably part of the genetic heritage the Polynesian forebears carried eastward.

    The researchers found that throughout Melanesia, the nine-base pair deletion is most common near the coast and absent in remote, hilly areas. And it is present in all Austronesian speakers, but only in some non-Austronesian-speaking groups. This pattern fits the idea of a rapid, relatively recent migration of ancestral Austronesian speakers through the area, Merriwether says. As the newcomers arrived on the already-inhabited islands, they settled along the coasts, introducing their languages and their genes. But these seafarers were slow to penetrate into the rugged, volcanic interior.

    Merriwether and Friedlaender found more evidence for a recent, rapid migration through Melanesia when they examined the sequence of some 500 bases in a hypervariable region of mtDNA known as the d-loop. The number of nucleotide differences in the d-loop sequences of any two individuals in a population provides evidence for how much time has passed since a small population began expanding—the more differences, the longer ago the population began to grow. Polynesians share a nearly identical d-loop sequence, varying by perhaps one nucleotide base. The same is true for Melanesians who have the nine-base pair deletion, indicating that both Polynesians and Melanesians who carry the deletion are recent descendants of the same small population. On the other hand, Melanesians who lacked the deletion had an average of seven to nine d-loop differences, evidence of a much older population that remained on Melanesia while the ancestors of Polynesians cruised the Pacific.
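
    The logic of the comparison can be illustrated with a toy calculation; the sequences below are invented and this is not the authors' actual analysis, but the counting step is the same: tally, for each pair of individuals, the positions at which their aligned d-loop sequences differ.

        # Toy illustration of pairwise d-loop difference counts; the sequences
        # are invented and this is not the authors' actual pipeline.
        from itertools import combinations

        def differences(seq_a, seq_b):
            """Number of positions at which two aligned, equal-length sequences differ."""
            return sum(1 for a, b in zip(seq_a, seq_b) if a != b)

        population = {
            "sample_1": "ACGTTACGGA",
            "sample_2": "ACGTTACGGA",   # identical to sample_1: 0 differences
            "sample_3": "ACGATACGGC",   # 2 differences from the others
        }

        pairs = list(combinations(population, 2))
        counts = [differences(population[a], population[b]) for a, b in pairs]
        print("mean pairwise differences:", sum(counts) / len(counts))
        # A near-zero mean (as among Polynesians) points to recent descent from a
        # small founding population; a mean of seven to nine (as among Melanesians
        # lacking the deletion) points to a much older, long-resident population.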

    Other researchers approve of combining genetic and linguistic data, but even some geneticists say that the results don't rule out other scenarios of settlement. Henry Harpending, a population geneticist at the University of Utah, Salt Lake City, says, “It's a lot of data that doesn't … consider other explanations such as migration from South America or genetic drift,” in which some genetic variation, lost by random chance in small populations, led to apparent genetic similarities today. More convincing would be data from “around 100 genetic loci,” he adds.

    As the Polynesian picture becomes clearer, researchers are probing other questions—such as why Melanesia is so incredibly diverse. “We are talking about very complex relationships … that reflect the influence of different migrations” at different times, Friedlaender says. And the right genes can continue to tease apart that history, he says.

  6. SCIENCE EDUCATION

    Science Fairs Pump Up the Rewards of Talent

    1. Laura Helmuth

    If you can't run the nation's most prestigious high school science fair, start your own—and make it even more lucrative for the winners. That's the genesis of the Siemens Westinghouse Science & Technology Competition, which announced its first winners this week in Washington, D.C. In doing so, Siemens prompted Intel, which last year beat out Siemens for sponsorship of the venerable Science Service Talent Search, to raise its prize money as well.

    Lisa Harris of Dalton High School in New York City won this year's top Siemens prize for individuals—a $100,000 college scholarship—for developing a new method to detect carriers of a gene responsible for cystic fibrosis. Daniar Hussain of Richland High School in Johnstown, Pennsylvania, and Steven Malliaris of New Trier High School in Winnetka, Illinois, won the team competition and will divide a $90,000 scholarship for their method of generating computer programs to store data more efficiently. A total of 12 individuals and teams chosen from six regional competitions competed for the top prizes; combined with awards to students who score well on advanced placement tests and their teachers, the total Siemens pot comes to nearly $1 million.

    Last year was tumultuous for the world of U.S. high school science competitions. Westinghouse ended 57 years of support for the annual Science Service Talent Search, which has been a scientific launching pad for five Nobel Prize winners and 30 members of the National Academy of Sciences. In a competition for a new sponsor, Siemens—which bought Westinghouse after it dropped its support—was one of 75 companies vying for the honor. After Intel won, Siemens decided to start its own contest.

    “It was enlightened self-interest,” says Albert Hoser, head of the Siemens Foundation, a corporate charity created in 1998. “We need employees at the cutting edge of science and math. We want to inspire students to give those subjects top priority.” Not to be outdone, Intel has doubled the prize money awarded to Science Service Talent Search winners, including $100,000 to be awarded in March for first place among 40 finalists, and is offering a total pool that exceeds $1.2 million. “We like to think we're serving as a catalyst for other corporations to raise the bar,” says Siemens spokesperson Esra Ozer.

    Indeed, one Nobelist urges other industrial giants to follow the lead of the German telecommunications company and the U.S. chipmaker. “The more the merrier,” says physicist Leon Lederman, who attended the Siemens awards ceremony. “What's wrong with General Motors? Why don't they have a prize, or Ford, or any other corporation?”

    Harris started working at the Public Health Research Institute in New York City at 16 and already is a co-author of a paper classifying the virulence of different strains of tuberculosis. For the competition, she used a fluorescent marker to tag blood samples that harbor a mutation in the cystic fibrosis gene, producing a test that is simpler and faster than current techniques. “I hit a wall over the summer,” she says about a project that took 11 months. “The most exciting thing for me was to make it work.” She finished the latest experiments just in time to write them up for her poster session at the awards ceremony.

    Hussain and Malliaris took a computer science class together in Illinois before Hussain's family moved to Pennsylvania earlier this year. “We knew the distance wouldn't be a problem,” says Malliaris, who collaborated with Hussain via the Internet. The students used evolutionary principles to improve polynomials that direct the storage of items from large data sets. They introduced random mutations in a population of techniques and then repeatedly eliminated the ones that were unfit and selected ones that distributed data most efficiently.
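
    The article describes the method only in outline. The sketch below shows the general mutate-and-select loop that outline implies, using an invented fitness measure (how evenly a polynomial hash spreads items across storage buckets); it is illustrative only, not the students' program.

        # Toy mutate-and-select loop over polynomial coefficients. The data,
        # fitness measure, and parameters are invented for illustration;
        # this is not the students' actual program.
        import random

        DATA = [random.randrange(10_000) for _ in range(500)]
        BUCKETS = 16

        def fitness(coeffs):
            """Reward polynomials that spread DATA evenly across BUCKETS."""
            counts = [0] * BUCKETS
            for x in DATA:
                counts[sum(c * x**i for i, c in enumerate(coeffs)) % BUCKETS] += 1
            mean = len(DATA) / BUCKETS
            return -sum((n - mean) ** 2 for n in counts)   # less spread-out counts score higher

        def mutate(coeffs):
            """Randomly nudge one coefficient."""
            new = list(coeffs)
            new[random.randrange(len(new))] += random.choice([-1, 1])
            return new

        population = [[random.randrange(1, 10) for _ in range(3)] for _ in range(20)]
        for _ in range(50):   # generations
            population.sort(key=fitness, reverse=True)
            survivors = population[: len(population) // 2]
            population = survivors + [mutate(random.choice(survivors)) for _ in survivors]

        print("best coefficients found:", max(population, key=fitness))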

    The Siemens competition focuses on the quality of the student research projects, whereas the Intel talent search also judges contestants on creativity and their knowledge of science. In addition to these competitions, the National Science Teachers Association (NSTA) hosts three national contests that focus on students' imaginations—some are entirely conceptual—and don't require them to work in a lab outside school. “The Intel competition is for students already definitely going into science, who are already interested in science,” says NSTA spokesperson Cindy Workosky. “We're trying to turn them on to science.”

    For Lederman, every science competition “is another opportunity for the kids in America to excel.”

  7. TECHNOLOGY

    Nanotubes Generate Full-Color Displays

    1. Dennis Normile

    Tokyo—Liquid crystal displays (LCDs) are fast becoming ubiquitous—just glance at your wristwatch, laptop, or calculator. But they still can't compete with bulky cathode ray tubes (CRTs) for displaying high-quality images. Now, a team of Korean researchers at Samsung has produced a working display that promises to combine the quality of CRT images with the convenience of a flat panel by using carbon nanotubes as its source of electrons. Their work, published in the 15 November issue of Applied Physics Letters, gives Samsung a small lead in a heated race to commercialize the technology.

    The CRT is a dinosaur in the fast-changing world of electronics, surviving because of its brightness, resolution, and ability to show moving images. But it may be driven to extinction by a technology called field emission, which like the CRT uses electrons to light up colored phosphors on a glass screen (Science, 31 July 1998, p. 632). Instead of a single electron gun at the far end of a bulky and heavy cone-shaped tube, however, field-emission displays use thousands of tiny pointed electron emitters arrayed within a flat panel. An electric field pulls a stream of electrons from each point. Field-emission displays can be built as thin as LCDs, yet consume less power and promise to produce images comparable to a CRT. The problem to date has been finding materials that are easily fabricated into pointed shapes yet can hold up to the intense stream of electrons.

    Carbon nanotubes, which conduct electrons freely, could fit the bill—if researchers can find a way to control their fabrication and place them in a precise pattern. The Samsung group has done just that. The result, reported by Won Bong Choi and colleagues at the Samsung Advanced Institute of Technology in Suwon, South Korea, is the first full-color field-emission display using carbon nanotubes as the electron emitters. “It's an important step forward,” says Yahachi Saito, an associate professor of electrical and electronic engineering at Japan's Mie University, who is also working on carbon nanotube field-emission displays.

    The carbon nanotubes, cousins of the soccer ball-shaped fullerene carbon-60 molecule, are created by passing an arc discharge between graphite electrodes in a chamber filled with helium. The Samsung team mixes a conglomeration of single-walled carbon nanotubes into a paste with a nitrocellulose binder and squeezes the concoction through a 20-micrometer mesh onto a series of metal strips mounted on a glass sheet. As the nanotubes emerge from the mesh, they are forced into a vertical position. The researchers then heat the arrangement to burn off the nitrocellulose binder and melt metal particles in the paste. When the metal solidifies, it binds the nanotubes to the metal substrate. “Getting the nanotubes perpendicular to the substrate and evenly spread out is the key to getting even brightness in the finished panel,” Choi notes.

    The metal strips with the carbon nanotubes sticking out of them serve as cathodes, running from top to bottom of a glass sheet that serves as the back of the finished display. The front of the display is a glass sheet containing red, green, and blue phosphors and strips of a transparent indium-tin-oxide anode running from side to side. The glass sheets are separated by spacers. Once assembled, the edges are sealed and air is pumped out of the display. Each intersection where a vertical cathode and a horizontal anode cross forms a pixel. Each pixel is turned on or off by applying a voltage to its defining vertical cathode and horizontal anode. As with a CRT, an image is formed by setting the individual brightness and color for each pixel. “It's a very impressive display,” says Sumio Iijima, a microscopist at NEC Corp.'s Fundamental Research Laboratories in Tsukuba, Japan, who discovered carbon nanotubes in 1991.
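
    The crossed-strip layout implies the row-by-row scanning used in other matrix-addressed panels. A minimal sketch of that idea follows, with the panel size, test pattern, and drive details invented rather than taken from Samsung's design.

        # Sketch of the row-by-row scanning implied by the crossed-strip layout.
        # Panel size, test pattern, and drive details are invented; this is not
        # Samsung's actual drive electronics.
        ROWS, COLS = 4, 6
        frame = [[(r + c) % 2 for c in range(COLS)] for r in range(ROWS)]   # checkerboard test image

        def scan(frame):
            for r, row in enumerate(frame):
                lit = [c for c, pixel_on in enumerate(row) if pixel_on]
                # Drive the selected cathode strip and energize only the anode
                # columns whose intersections (pixels) should emit light.
                print(f"cathode row {r}: energize anode columns {lit}")

        scan(frame)
        # Repeating the scan many times per second makes all pixels appear lit
        # simultaneously, as in other matrix-addressed displays.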

    Samsung's 4.5-inch (11.4 centimeters) display could be the precursor of a new generation of more energy-efficient, higher performance flat panel displays for notebook computers and wall-hanging televisions. The carbon nanotubes appear to be durable enough to provide the 10,000-hour lifetime considered to be a minimum for an electronics product, and the panel uses just half the power of an LCD to produce an equivalent level of screen brightness. They should also be cheaper to produce than LCDs or other types of field-emission displays being developed, Choi says, adding that the technology also permits larger, more defect-free screens to be made than is possible with LCDs.

    Although the Samsung group has earned bragging rights with its full-color field-emission display, which it demonstrated earlier this year at a conference, it's only a step ahead of the competition. Ise Electronics Corp. of Mie Prefecture in Japan has since then shown a similar display based on Saito's work. At least five major Japanese electronics manufacturers are working on the technology, notes Saito, but their progress is a secret. “Only Samsung and Ise Electronics are [being] open about their research,” he says.

    Iijima praises the technical advances in the use of carbon nanotubes, but he says that “commercialization is another question.” Nanotubes must overcome the huge investment in the manufacturing of LCDs, which continue to fall in price. “LCD technology is improving very rapidly,” Choi admits. His field-emission displays are at least 3 years away from stores, he estimates.

  8. HUMAN GENOME RESEARCH

    German Effort Stuck in Minor League

    1. Michael Hagmann

    Munich—When Germany finally began making a contribution in the mid-1990s to the worldwide effort to sequence the human genome and utilize its information, the prevailing thinking in the scientific community was “better late than never.” Germany's research ministry stepped gingerly onto the human genome bandwagon in 1996 with a modest $23-million-a-year program for an initial 3-year period. The investment has resulted in some notable achievements, and several recent reports have recommended major increases in genome research funding. But at a meeting here last week that aimed to celebrate the results of the first phase of the German Human Genome Project (DHGP), researchers seemed to be already suffering from a hangover: Phase two of the DHGP officially began last month with an unchanged budget.

    “There's a lot of disappointment [in the community]. This is far from being enough. Compared to what we said we required, we're about 10-fold underfunded,” says geneticist Rudi Balling of the National Research Center for Environment and Health in Munich, adding that the pharmaceutical industry in Germany is “adding close to nothing” to the project. Hans Lehrach, a molecular geneticist at the Max Planck Institute for Molecular Genetics in Berlin, says that “Cambridge, Massachusetts, is spending more money than Germany, and just the construction of the Sanger Centre [Britain's main sequencing center near Cambridge, U.K., funded by the Wellcome Trust] was worth about 8 years of German genome research.”

    Despite its meager resources, the DHGP has made its mark on the international genome program. German and Japanese groups are the driving force behind the sequencing of chromosome 21, which is expected to be the next one finished after chromosome 22, whose sequence was published last week. DHGP researchers are also working hard to unravel the role of the thousands of previously unknown human genes by testing their function in model organisms such as mice or fruit flies. “In many ways, Germany is ahead of us because it centers not just on sequencing but also on how [the sequence information] functionally integrates,” says David Cox, a geneticist at Stanford University.

    In light of these successes, Germany's main granting agency, the DFG, last June recommended that an additional $550 million be spent on genome research over the next 5 years. And recently the Association for the Promotion of Human Genome Research, an industry group, together with the DHGP's scientific coordinating committee, argued for an overall annual budget of $280 million. The research ministry's refusal to heed this advice has plunged Germany's genome community into despondency. “There's a real danger that with genomics, just like with microelectronics, Germany is going to miss out on another scientific revolution—which will seriously jeopardize its future as a high-tech country,” says Lehrach. “It's like we're on the Titanic and nobody's interested in icebergs.” Balling admits that the scientific community has to take part of the blame. “Obviously we weren't able to communicate the prime importance of genome research. Space scientists, for instance, succeeded in speaking with one voice. That's something we still have to learn,” he says.

    At last week's meeting, researchers did express some hopes that, when the genome effort moves from brute sequencing to sifting through the data to tease out the roles genes play in cells and organisms, the DHGP's vice—a watering-can approach to funding—may turn into a virtue. “In the U.S. there are fewer and fewer people with bigger and bigger institutes [that get most of the funding]. For functional genomics that is not the way to go; you don't need gigantic institutions for that,” says Cox. But even small, specialized functional genomics groups need funding, and making this point loud and clear in Munich was a big step for the genomics community. Says Cox, “If policy-makers still don't understand [the impact of genome research], there's no hope for them.”

  9. ASTROPHYSICS

    Z Mimics X-rays From Neutron Star

    1. Andrew Watson*
    1. Andrew Watson is a writer in Norwich, U.K.

    When a massive object such as a neutron star or a black hole steals matter from a companion star, the process is anything but stealthy. Material cascading from the star to its greedy partner forms a disk and heats up, illuminating the in-falling material and surrounding gas with a blaze of x-rays in one of the cosmos's more spectacular displays. The spectrum of the x-rays could reveal the temperature and density of the accretion disk and the forces that shape it—if only astrophysicists knew how to read the spectral clues. Computer models are de rigueur, but until recently astrophysicists had nothing on Earth against which to check their results. Now, thanks to an earth-shaking experiment at Sandia National Laboratories in Albuquerque, New Mexico, they do.

    The key is Sandia Laboratories' Z-pinch machine, known to its users simply as Z and ordinarily used for defense and fusion research. To anybody who has derived a perverse pleasure from shooting sparks across the terminals of a car battery, Z is the ultimate thrill, the world's greatest fuse blower. At the flick of a switch, an 18-million-amp current surges through a cylindrical array of 300 fine tungsten wires, each just 10 micrometers in diameter. The immense current creates a magnetic field that squeezes the wires toward the center of the cylinder at nearly the speed of light. When the wires collide, they vaporize to create a hot ionized gas, or plasma, that spawns a mighty 120-terawatt blast of x-rays—making Z the world's most powerful x-ray source.

    “When you set off Z the whole building shakes, the ground shakes. You can feel it for a few hundred meters in any direction,” says Robert Heeter of Lawrence Livermore National Laboratory in California, a plasma physicist who will present the first results of the effort to mimic astrophysical x-rays with Z at a workshop at the Goddard Space Flight Center in mid-December. “It's very exciting.”

    Even more exciting for Heeter and his colleagues is what the x-ray blast does to an iron-foil target next to the cylinder. “Iron is a very popular element amongst astrophysicists,” says Mark Foord, the Livermore plasma physicist who heads the Sandia-Livermore collaboration along with Sandia's James Bailey. Iron, Foord explains, is easy to spot at x-ray wavelengths and is abundant in space. “It's found in almost every astrophysical body.”

    As a result, researchers often study iron as a stand-in for cosmic matter in general. But features of the spectrum of highly ionized iron are “notoriously complex” to predict, says Andrew Fabian of the Institute of Astronomy in Cambridge, U.K., which makes it difficult to turn iron spectra gleaned from satellites such as the Chandra X-ray Observatory into astrophysical know-how. “We can't calculate the relation between temperature, density, and line brightness directly; we have to observe it,” adds Stanford Woosley, an astrophysicist at the University of California, Santa Cruz.

    That is where Z comes in. Literally in a flash, it rips anywhere from 10 to 16 of the 26 electrons off each iron atom, converting the foil into a thin, highly ionized gas whose spectrum mimics that of the iron in the gas cloud of the cosmic x-ray source. “Instead of having a neutron star, we're using the Z machine x-rays as our x-ray source,” Foord says. “We're watching what the effects are of those x-rays on our target and trying to reach similar conditions that are found out in space.”

    The name of the game, says Woosley, is “to study how iron behaves in some of the most extreme conditions in the universe so that when we see emission lines of iron, we can use those lines to understand stars and black holes so dense and exotic that most of their emission comes out as x-rays.” One prize, Foord says, will be the ability to work out the sizes, shapes, and compositions of the accretion disks in binary systems. And if astronomers can quantify the relation between the accretion rate and the x-ray brightness, they could estimate how hungrily one partner in a binary system is gathering in matter—a clue to whether it is, for example, a white dwarf or a neutron star.
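
    The zeroth-order form of that relation is standard textbook physics rather than anything reported in the article: the x-ray luminosity is roughly the gravitational energy released per second by in-falling matter, L ≈ GMṀ/R, so the same accretion rate yields a far brighter source on a compact neutron star than on a much larger white dwarf. A rough comparison, with assumed masses, radii, and accretion rate:

        # Zeroth-order accretion luminosity L ~ G*M*Mdot/R for the same accretion
        # rate onto a white dwarf and onto a neutron star. Masses, radii, and the
        # accretion rate are assumed illustrative values, not from the article.
        G = 6.674e-11                      # m^3 kg^-1 s^-2
        M_sun = 1.989e30                   # kg
        year = 3.156e7                     # s
        Mdot = 1e-9 * M_sun / year         # an assumed 1e-9 solar masses per year

        bodies = {
            "white dwarf":  (0.6 * M_sun, 7.0e6),   # mass (kg), radius (m)
            "neutron star": (1.4 * M_sun, 1.0e4),
        }
        for name, (M, R) in bodies.items():
            print(f"{name}: L ~ {G * M * Mdot / R:.1e} W")
        # For the same accretion rate the neutron star comes out more than a
        # thousand times brighter, which is why brightness is such a useful clue.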

    Fabian thinks iron spectra may not bare all the secrets of neutron stars and black holes, because their motion and gravity will blur the x-ray emission. But he says the spectra “will be very important for modeling the gas in clusters of galaxies and stars.” Astronomers would like to study the x-rays emitted by the gas for clues to where shock waves are heating it and where supernovae have exploded.

    “What they're doing at Sandia couldn't be done elsewhere,” says astrophysicist Francis Keenan, who with his colleagues at Queen's University Belfast is part of the worldwide consortium of groups working with the Sandia-Livermore team. But the accretion disk of knowledge will grow if funding allows more earth-shaking astrophysics with Z, along with studies on Z's smaller sibling, Saturn. “I think this work in general, of using plasmas in the laboratory to mimic very faraway plasmas in astronomy, is going to be very important over the next few years,” says Keenan.

  10. ASTROPHYSICS

    Space Becomes a Physics Lab

    1. Robert Irion

    Physicists and cosmologists are teaming up to plan visionary projects that would glean fundamental physics from relics of the big bang and from energetic corners of today's universe

    Rohnert Park, California—In his vision of the future of x-ray astronomy, Nicholas White hovers above the boundary of a black hole at the center of a distant, turbulent galaxy. He sees the fabric of space-time contort weirdly around the giant hole, traced by bright gashes of x-rays as hot gas plunges from sight. His vantage point gives him a clear view of the mechanism that blasts powerful jets of matter into space from just outside the black hole. Occasional wormholes or other exotic portals may pop into view.

    White's dream is not to inject himself directly into this relativistic mayhem but to study its details from afar, with a fleet of x-ray telescopes staring in unison from their orbit around the sun. Just as widely separated radio telescopes on Earth can act as a single outsized telescope to image fine detail in distant objects, an orbiting x-ray interferometer could probe the innermost workings of the most energetic bodies in the cosmos. “We've proved that black holes exist,” says White, an astronomer at NASA's Goddard Space Flight Center in Greenbelt, Maryland. “Ultimately, we want to know how they work, and this is the best way to do that.”

    The x-ray fleet, dubbed MAXIM for MicroArcsecond X-ray Imaging Mission, might not fly for 20 to 25 years, White acknowledges. That made it a perfect topic to broach at “Cosmic Genesis and Fundamental Physics,” an unusual workshop held here this fall at Sonoma State University.* About 150 leading astronomers, cosmologists, and physicists convened to gaze into the crystal balls of their fields in the early 21st century, looking for areas of common focus. Their eyes settled upon the origin of the universe, when the basic laws of physics were set, and violent corners of the universe today, which host energetic physics beyond the reach of particle accelerators. At the workshop, they described dozens of bold experiments that might glean new physics from the cosmos.

    For now, most of the projects have no funding or timetables, but the meeting organizers hoped the discussions would catalyze the planning needed for some missions to take wing in the next few decades. Traditionally, particle physicists and astrophysicists have blazed separate trails, with little communication and few interdisciplinary experiments. Two current exceptions, planned for launch by 2005, are the Gamma-Ray Large Area Space Telescope (GLAST), which will study supernova remnants, gamma ray bursts, and other cosmic particle accelerators, and the Alpha Magnetic Spectrometer, which will search for antimatter particles from a perch on the international space station. And NASA Administrator Daniel Goldin, for one, wants to see many more such collaborations in the new millennium.

    At a meeting earlier this year at the Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, Goldin urged physicists to look to the cosmos for insights about ultrahigh-energy particles and processes (Science, 4 June, p. 1597). He renewed that theme on 6 November in Irvine, California, at an address to the Board on Physics and Astronomy of the National Academy of Sciences. “If we are to make grand breakthroughs in fundamental physics, we need to pay more attention to this overlap of physics and astrophysics,” Goldin said. He envisions a new “Cosmic Journeys” program at NASA to support some of these projects and hopes other agencies—notably the National Science Foundation (NSF) and the Department of Energy (DOE)—will join in.

    Goldin's Fermilab address, and the prospect of a new crosscutting funding initiative, motivated cosmologist Rocky Kolb of Fermilab and physicists Elliott Bloom of Stanford University and Jim Siegrist of Lawrence Berkeley National Laboratory in Berkeley, California, to organize the Sonoma State meeting. Speakers were encouraged to proffer blue-sky ideas that look a decade or a quarter-century into the future, perhaps drawing upon technologies that are mere twinkles in their eyes today. “I didn't have very high expectations, but we're in wild agreement about the need to approach these problems on many different fronts simultaneously,” Siegrist said when the meeting ended. “This really could go somewhere”—if he and his colleagues can cross disciplinary barriers and make a unified case for the most promising efforts.

    Ripples in space-time

    Gravity in its many manifestations emerged as a favorite pursuit. Physicist Peter Bender of the University of Colorado, Boulder, described the Laser Interferometer Space Antenna, or LISA: three spacecraft arrayed in a triangle 5 million kilometers on a side. Laser beams would measure the position of a free-floating metal cube within each spacecraft to an accuracy of a tenth of an angstrom, relative to both other cubes. That's enough to resolve passing gravitational waves—ripples in space-time with waveforms and structures that, according to theory, are dictated by the relativistic physics near violent cosmic events. Whereas the gravitational-wave observatories about to start observing from the ground may see signs of powerful bursts, such as the collapse and explosion of massive stars, LISA's immense scale will make it sensitive to gravitational ripples with far longer wavelengths. Such stretched-out waves should emanate from phenomena that last hours to months—most notably, pairs of black holes locked in ever-tightening death spirals.
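
    Taken together, those figures imply an extraordinary strain sensitivity, the fractional change in arm length that LISA must resolve:

        # Strain sensitivity implied by the figures above: a tenth of an angstrom
        # resolved over a 5-million-kilometer arm.
        delta_L = 0.1e-10            # 0.1 angstrom, in meters
        arm = 5e6 * 1e3              # 5 million kilometers, in meters
        print(f"h ~ delta_L / arm ~ {delta_L / arm:.0e}")   # about 2e-21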

    If all goes well, LISA could fly within 10 years. Farther in the future, an improved array of satellites called LISA 2 might be sensitive enough to detect the gravitational-wave background of the universe itself—a buzz of gravity waves equivalent to the faint cosmic microwave background “glow” that pervades the sky. “That's incredibly exciting, because it would reveal the first instant of the universe's transparency to gravity waves,” says Bloom. That moment occurred when gravity decoupled from the rest of nature's fundamental forces, perhaps just 10⁻³⁸ seconds after the big bang. The relics of that moment may reveal how gravity influenced the primordial quantum fluctuations that etched a template for the large-scale structure we see in the universe today.

    Those first gravitational waves also should have left subtle imprints on the cosmic microwave background itself, says cosmologist Marc Kamionkowski of the California Institute of Technology in Pasadena. Just as a lake polarizes light that reflects off the water, gravitational waves would have polarized the photons of the early universe at the moment that space cooled enough for them to escape the hot gas. The European Space Agency's Planck satellite may see hints of this shadowy polarization after its launch in 2007, but Kamionkowski advocates a dedicated post-Planck mission to scrutinize the polarization in detail. He notes that different scenarios for inflation, the wild burst of growth the infant universe is thought to have experienced, predict varying patterns of polarization in the cosmic microwave background. A satellite sensitive enough to discriminate among those patterns could settle the inflation debate and steer physicists closer toward understanding how the basic laws of nature operated in the first incandescent moments of the universe.

    Cosmic particle factories

    Other cosmic messengers are less subtle. Rare, ultrahigh-energy cosmic rays reach Earth carrying vastly more energy than terrestrial accelerators can generate: the punch of a brick dropped from a table, packed into a single particle. Physicists speculate that these wanderers may stream from the cores of active galaxies, gamma ray bursts, or spinning black holes, but no convincing theory exists about the astrophysical mechanisms that could boost particles to such extreme speeds. Ground detectors watching for the sprays of lower energy particles that an ultraenergetic cosmic ray excites as it slams into the upper atmosphere have spied only about 20 events so far, because the detectors monitor limited patches of the sky. Ultraviolet imagers in orbit could cover 10,000 times more area by watching the atmosphere from above, allowing Earth itself to serve as a kind of cosmic-ray display screen, says physicist Robert Streitmatter of NASA Goddard (Science, 5 December 1997, p. 1708). A pair of satellites, dubbed OWL for Orbiting Wide-Angle Light Collector, could trace the paths of the most energetic cosmic rays through the atmosphere, thereby pinpointing their sources and perhaps disclosing the nature of the objects that produce them.
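
    The brick comparison checks out in round numbers (the brick's mass, the table height, and the particle energy below are assumed typical values, not figures from the article):

        # Round-number check of the "brick dropped from a table" comparison.
        # Brick mass, table height, and particle energy are assumed values.
        eV = 1.602e-19                      # joules per electron volt
        particle = 3e20 * eV                # ~3e20 eV, around the highest energies seen
        brick = 2.5 * 9.8 * 0.8             # m*g*h for a 2.5 kg brick falling 0.8 m
        print(f"particle: {particle:.0f} J   brick: {brick:.0f} J")   # both tens of joules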

    Detailed surveys of the universe in visible light also have much to offer in the next few decades, says astronomer Roger Angel of the University of Arizona, Tucson, including the prospect of mapping something that doesn't shine at all: dark matter. Invisible concentrations of matter bind together large clusters of galaxies and surround individual galaxies like our Milky Way with vast cocoons. Although the motions of galaxies and stars reveal the gravitational pull of this dark matter, no one knows what it is made of—perhaps dim clumps of ordinary matter or exotic particles that physicists have not yet been able to detect on Earth. A detailed sky map of the dark matter might offer clues to its composition, as well as nailing down the total matter content of the universe.

    Angel and his colleagues, including astrophysicist Anthony Tyson of Lucent Technologies in Murray Hill, New Jersey, have proposed a new telescope that they believe could chart dark matter in exquisite detail. The design of their “Dark Matter Telescope” takes advantage of the way these invisible mass concentrations warp our view of distant galaxies by acting as “gravitational lenses,” dimpling the fabric of space-time. The telescope would use an 8.4-meter mirror and two 4-meter mirrors in a radical configuration to map these subtle distortions across gigantic patches of the sky, six times wider than the full moon.

    The most mind-bending proposal for a dark-matter search may have come from Nobel laureate Martin Perl, a physicist at the Stanford Linear Accelerator Center (SLAC). Perl proposed scouring asteroids for massive particles that accelerators will never produce on Earth. These stable but ultraheavy particles, spawned in the big bang, might survive today in ancient material, such as asteroids or comets, as well as scattered between the stars. As a low-cost initial search, Perl's team plans to create tiny drops from meteorite fragments and measure the rates at which they fall in the laboratory. Although the drops, only about 10 micrometers across, will be too small to weigh easily, they will fall through air with different terminal velocities depending on their masses, perhaps revealing ultraheavy particles bound to ordinary matter within the meteorite material. But the best chance to spot them, Perl says, is to send a robot to an asteroid, where it could sift through the pristine soil and scan the terminal velocities of particles within the asteroid's minuscule gravitational field.
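
    The falling-drop measurement rests on a standard piece of physics: for drops this small, Stokes drag sets a terminal velocity that rises with the drop's density, so a drop harboring an anomalously heavy particle would fall measurably faster. A minimal sketch with assumed drop sizes and densities:

        # Stokes-law terminal velocity for small drops falling in air,
        # v = 2 * r^2 * (rho_drop - rho_air) * g / (9 * eta).
        # Drop radius and densities are assumed illustrative values.
        g = 9.8            # m/s^2
        eta = 1.8e-5       # viscosity of air, Pa*s
        rho_air = 1.2      # kg/m^3
        r = 5e-6           # 5-micrometer radius (a drop about 10 micrometers across)

        def terminal_velocity(rho_drop):
            return 2 * r**2 * (rho_drop - rho_air) * g / (9 * eta)

        print(f"ordinary rock drop (3000 kg/m^3): {terminal_velocity(3000)*1e3:.1f} mm/s")
        print(f"drop with a heavy inclusion (6000 kg/m^3): {terminal_velocity(6000)*1e3:.1f} mm/s")
        # Doubling the effective density roughly doubles the fall speed, the kind
        # of anomaly the proposed measurement would look for.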

    Some of the proposals discussed at the meeting are in the preliminary funding pipeline at NASA or one of the other agencies. Others would require substantial new funding. And which to pursue first was a common hallway question. “Everyone has a program, and everyone's program is interesting,” says SLAC physicist Helen Quinn. “But how do you set priorities and decide which ones to do?”

    The astronomy and particle physics communities both have mechanisms for doing so, but their approaches are different. Astronomers convene a “decadal review” panel that ranks the importance of new projects over a 10-year period; the next report is due in the spring. Particle physicists have a High-Energy Physics Advisory Panel, which advises the Department of Energy—the field's main funder—four times a year. For interdisciplinary proposals like the ones discussed at the Sonoma State meeting, it might be wise to set up a new advisory committee that reports formally to NASA, NSF, and DOE, says Peter Rosen, DOE's associate director for high-energy and nuclear physics—perhaps along the lines of SAGENAP (Scientific Assessment Group for Experiments in Non-Accelerator Physics), nine physicists who have advised DOE and NSF on projects such as GLAST.

    But setting priorities is just a start. To attract funding for a major new initiative, astronomers and physicists will need to get savvy about marketing their plans, says Alan Bunner, science program director for the structure and evolution of the universe at NASA. “The added momentum of two or three agencies advocating the same initiative is hard to stop,” Bunner says. “But the American people have to be engaged. Intellectual interest is not enough.” He urged his audience at Sonoma State to distill the most exciting aspects of their proposed initiative and push them aggressively, both within the funding agencies and in their own communities.

    Siegrist and his colleagues have charted the next few steps along this path. In February, theorists will gather in Aspen, Colorado, to discuss which future experiments have the most potential to solve fundamental mysteries in physics and cosmology. Two months later, the organizers will meet in Washington, D.C., to prepare a white paper for the directors of DOE, NSF, and NASA. If the prospects in Congress look good for a new physics-astrophysics budget initiative, the group will plan the last of its initial meetings, probably a Snowmass, Colorado, conference in the summer of 2001.

    Another uncertainty may loom at that point: How well would astronomers and physicists collaborate when they face the nitty-gritty details of joint projects? “We have developed different traditions,” says Rosen. “We don't have a tradition in high-energy or nuclear physics to make data widely available to anyone,” because most complex detectors yield results only after painstaking analysis by the teams who built them. Astronomers, on the other hand, tend to build community instruments and quickly share what they find.

    The organizers of this budding movement think these disparate fields will find a way to cooperate, because the opportunities are irresistible. “Don't be afraid to take a big step if one is indicated,” Fermilab's Kolb urged his colleagues in a rousing final address at Sonoma State. “John Muir said that when one tugs at a simple thing in nature, he finds it hitched to the rest of the universe. Our laboratory for fundamental physics is now the universe itself.”

  11. PHYSICS

    Conjuring a Solitary Sound Wave

    1. Alexander Hellemans*
    1. Alexander Hellemans is a writer in Naples, Italy.

    Spotted more than 160 years ago on the water of a canal and now routinely generated in light-carrying fibers, the solitary, long-lasting waves called solitons have now been seen in yet another medium: sound. In the 15 November Physical Review Letters, a team of researchers in Japan describes how it produced acoustic solitary waves by altering the propagation of sound through an air-filled tube.

    In 1834 a British naval architect, John Scott Russell, was the first to spot a soliton, racing away from the prow of a boat on the Edinburgh-Glasgow Canal. Working in a 9-meter tank that he built in his garden, Russell discovered that such waves survive because they avoid dispersing. A single water pulse contains waves at many different frequencies, and two effects balance out to keep the waves from separating. Lower frequency waves travel faster and would tend to outrun the pulse—except that the steepness of the water surface within the pulse speeds up higher frequency waves just enough to compensate. Optical fibers can also host solitons, because similar effects conspire to keep light pulses from dispersing in glass. But air—the usual medium for sound waves—behaves in just the wrong way to host a soliton. The speed of sound in air varies with intensity only, not frequency, which causes intense blasts of sound to pile up into shock waves.
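
    The textbook way to write down that balance, which the news story describes only in words, is the Korteweg-de Vries equation for a pulse profile $u(x,t)$:

    \[ \frac{\partial u}{\partial t} + c_{0}\,\frac{\partial u}{\partial x} + \alpha\, u\,\frac{\partial u}{\partial x} + \beta\,\frac{\partial^{3} u}{\partial x^{3}} = 0, \]

    where the nonlinear term $\alpha\, u\,\partial u/\partial x$ steepens the pulse, the dispersive term $\beta\,\partial^{3} u/\partial x^{3}$ spreads it out, and a soliton is the special profile for which the two effects cancel exactly.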

    “Originally, my motive was to suppress shock formation in tunnels generated by the entry of a high-speed train,” says Nobumasa Sugimoto of the University of Osaka in Japan. Sugimoto and his colleagues took a steel tube, 7.4 meters long and 8 centimeters in diameter, and grafted onto it 148 so-called Helmholtz resonators, which resonate at specific frequencies. They sent sound pulses through the tube and tracked how the sound propagated. The team found that the pulses kept their smooth profile, without forming shock waves. What's more, the Helmholtz resonators apparently altered the sound speed depending on frequency, creating the right conditions for the acoustic solitary waves.

    Acoustics researcher Junru Wu of the University of Vermont, Burlington, who says he tried and failed to produce acoustic solitons 15 years ago in a similar experiment, thinks the work could have applications outside the laboratory. By adding side branches like those in Sugimoto's lab to exhaust pipes and other machinery, engineers might tame noise, transforming the shocks into milder solitary waves. And because the solitary waves can transport energy steadily over a long distance, Sugimoto and collaborators at Sanyo Electric Co. in Osaka are exploring potential applications in acoustic compressors, heat engines, and even heat pumps.

  12. SWEDEN

    Drug Research Endures the Pains of Globalization

    1. Joanna Rose,
    2. Annika Nilsson*
    1. Rose and Nilsson are writers in Stockholm, Sweden.

    Sweden's drug companies have joined the transnational drug industry, but their researchers are finding the global job market tough going

    Stockholm—It used to be easy to find work as a biomedical researcher in Sweden. The only tricky decision was whether to apply to Astra or Pharmacia, the country's two world-class drug companies. No longer. In 1995, Pharmacia merged with U.S.-based drug giant Upjohn, and since then the new company has cut its R&D personnel in Sweden by one-third, from 1500 to 1000. Last spring, Astra followed suit, merging with Britain's Zeneca to form the world's third largest drug company. And last week, the seemingly inevitable job cuts were announced: AstraZeneca plans to cut 450 positions from its research staff of 4000 in Sweden.

    The future of research at the two drug companies is not just a matter of national pride. Pharmaceutical research accounts for 14% of all privately funded R&D in Sweden, and AstraZeneca and Pharmacia & Upjohn (P&U) provide the lion's share. Both companies are cutting staff to eliminate duplication and concentrate R&D efforts in fewer labs. AstraZeneca, which currently employs about 10,000 people in R&D worldwide, is cutting back research in Sweden that overlaps with efforts in Alderley Park and Charnwood in the U.K. and in Wilmington, Delaware. The affected areas include research in inflammatory and respiratory diseases and cancer. Remaining in Sweden will be research on pain, gastrointestinal disorders, and heart diseases and a small part of the company's asthma research. Responsibility for diseases of the central nervous system will be split between Sweden and the United States. Research and development will be directed from Sweden, even though the company's headquarters are in London. In all, 1000 jobs will disappear in Sweden, Britain, and the United States.

    Reaction in Sweden to the announcement has been mixed. Union representatives accept the realities of consolidation in the world pharmaceutical industry and are currently negotiating how the reductions should be handled. Some people may get jobs in the company's expanding drug production in Sweden. Researchers, on the other hand, are expressing concern. “It's not good that whole competence areas move out of Sweden, leaving no natural recruitment need among young researchers,” says Ulf Lundkvist, former research director of Pharmacia, who is now working as a consultant on technology transfer. Hans Wigzell, head of the Karolinska Institute and a science adviser to the government, worries that students thinking about a research career in the pharmaceutical industry may feel less optimistic about their prospects in Sweden, and that, in turn, will affect recruitment to medical research. Olle Stendahl, general director of the Medical Research Council, expresses similar concerns. But he adds that companies may be moving research elsewhere because Swedish researchers are not up to scratch. “It's a warning signal to the government and research financing bodies that we have to flex our muscles. If we want to be attractive, we have to put real efforts into basic research,” he says.

    Adding to the nervousness among researchers is the fact that the companies' plans are in flux and some early promises have not been kept. Last year, for example, P&U announced plans to build a $140 million facility in Stockholm devoted to metabolic research, which is currently carried out by some 300 scientists in the company's labs in Uppsala and Stockholm. P&U's head of research hailed the decision as an important signal to researchers in both Sweden and other countries. The plans were abandoned this summer, however. Company spokesperson Stefan Dennsjö says P&U is now reviewing all major investments and is looking at alternatives, which could entail investment at a different level or a different structure. “It's difficult to find any credibility in the company's planning when it can shift from month to month,” said union representative Anders Måhl, commenting on the cancellation earlier this fall. “The management is only focused on earning money and flirting with shareholders and financial journalists,” he said.

    Some see a possible silver lining in these reversals, however. Lundkvist notes that many of those who lost their jobs in the Pharmacia-Upjohn merger went on to form small biomedical companies, leading to an explosion in small company growth in and around Uppsala. Entire research teams have been snapped up by other, smaller Swedish companies. For example, the whole Pharmacia cancer research unit in Lund, which P&U planned to close down, was bought by the biomedical company Active Biotech. “A positive consequence of the new development is that the job market has become more flexible. Previously, Astra and Pharmacia recruited everyone,” Lundkvist says. Wigzell also hopes that “small new companies may emerge with new creative ideas,” but he believes that there have to be enough companies to create a critical mass. “The Chinese symbol for crisis conveys both threat and promise,” he says.

    Stendahl points out, however, that small companies often do contract work for the large corporations. “If the core disappears, the incentive to start small, growth-intensive companies will decrease.” The key, he says, will be to preserve enough cutting-edge research in Sweden.

  13. EARTH SCIENCE

    Terra Launch Spotlights NASA Observing System

    1. Andrew Lawler

    Next week's launch of the first of three satellites marks the debut of a smaller, cheaper version of a controversial effort to monitor Earth from space

    It has been restructured, rescoped, reviewed, and reviled. But next week the first of three major satellites to be launched under NASA's controversial Earth Observing System (EOS) program is scheduled to head into orbit. The 10-year, $7.25 billion program will gather data on how the planet's processes create climate and whether humans are seriously altering those processes. Along the way, NASA officials also hope to quiet concerns from scientists that the agency has gone too far in favoring hardware over data analysis and has reneged on its original long-term monitoring plan aimed at unlocking climate's deepest secrets.

    The scheduled 16 December launch from Vandenberg Air Force Base in Lompoc, California, will be both a triumph and a relief for NASA and the hundreds of researchers who have labored on the project. The 5000-kilogram, $1.3 billion Terra spacecraft will be followed in the next 3 years by Aqua and Chem; each carries an array of instruments that probe, at various wavelengths, everything from cloud development to sea temperatures to vegetation growth. Data from the trio will be supplemented by more than a dozen smaller U.S. missions and a variety of international missions under way or in preparation. “EOS represents a sea change” in the way earth scientists are able to monitor the planet, says Rafael Bras, a Massachusetts Institute of Technology hydrologist and chair of the NASA advisory panel that oversees the agency's earth sciences program. “For the first time, we are setting out to understand the [whole] Earth system.”

    That goal is about the only thing not changed since the program was conceived. The current EOS is a slimmed-down version of the massive platforms, intended to gather data for at least 15 years, that were envisioned in the 1980s (Science, 1 September 1995, p. 1208). The program has also suffered delays—Terra is 2 years behind schedule—as well as cost overruns, sniping by skeptical legislators, and complaints from scientists about its complex data system and insufficient research funds to complement the sophisticated hardware. With Terra on the launching pad, government officials are now searching for ways to keep the data flowing once the three spacecraft complete their mission.

    EOS began in the era of large, costly missions, before NASA Administrator Dan Goldin's “faster, cheaper, better” diktat. “It was done Soviet style—big and clunky,” recalls Jerry Mahlman, a Princeton atmospheric modeler and former chair of the NASA advisory panel for earth sciences. The initial concept was to build a few mammoth platforms carrying 30 instruments. Each would beam data to a complex centralized data system at NASA centers, which in turn would distribute information to everyone from researchers to high schoolers. The satellites would be replaced as they wore out, providing for continuous coverage. “This was not necessarily sold on science but on dreams, what's cool, and what's internationally spectacular,” says Mahlman.

    Those grandiose plans were scaled back, however, after predicted costs for EOS rose as high as $33 billion, provoking the concern of top NASA management and an outcry from Congress. A half-dozen redesigns later, EOS has evolved into a mix of small and large spacecraft, with a more decentralized data system and no immediate plans to replace the three major satellites. In addition, Congress in 1994 capped EOS spending at $7.25 billion.

    Ghassem Asrar, the NASA earth sciences chief who has headed the program since 1998, says that the present configuration provides “sufficient flexibility to focus on specific scientific questions while providing a continuous flow of data.” But not everyone agrees. “The frustration is that EOS started as a long-term observing system, but now we've screamed off in the other direction of individual research missions,” says Bruce Wielicki, a principal investigator for a clouds and Earth radiant energy experiment on both Terra and Aqua and a radiation scientist at NASA Langley Research Center in Hampton, Virginia. Wielicki and other researchers want long-term and continuous coverage, while others argue for fast and intensive missions. “There are two camps,” acknowledges Asrar, who says NASA has done its best to supplement Terra, Aqua, and Chem with separate missions to examine the effect of Earth's gravity on ocean circulation, the thickness of the world's ice sheets, ozone levels, and the amount of the sun's energy that falls on the planet's surface.

    There's also no commitment to extend these activities beyond 2006. NASA lacks a long-term, integrated science plan, notes a March 1999 National Research Council study on the fate of EOS beyond 2002. “EOS doesn't speak to continuity in … looking at how climate fluctuates,” says Mahlman. “It's up for a few years and then you bag it. And if you don't have continuous measurements, then what exactly are you doing?”

    NASA officials say that the three spacecraft could operate for a decade or more, twice their estimated lifetimes. One possible solution lies with a new generation of spacecraft, slated for launch in 2008 or so, that combines the fleets of the National Oceanic and Atmospheric Administration and the Defense Department. Scientists see them as a natural platform for their research instruments, alongside short-term weather forecasting equipment. “In the long run, the problem is that NASA is not the right agency to take on operational missions,” says Mark Abbott, an oceanographer at Oregon State University in Corvallis who is chairing a National Research Council committee studying a planned merging of government weather satellites. The committee, in a report due out early next year, is expected to recommend the launch of an interim satellite in 2005 or 2006 to plug the data gap; the topic is already under review at the White House.

    Although most scientists would applaud such a mission, they point out that orbital mechanics prevent a single spacecraft from meeting all their needs. Researchers interested in land processes, for example, prefer a satellite that passes overhead with the morning sun, gathering data before clouds build up. But those interested in clouds and humidity want data from the afternoon. One way to tackle the problem, says Asrar, is for NASA and U.S. scientists to work with overseas colleagues to develop “an international strategy,” which might include a European platform.

    In the meantime, many researchers remain unhappy about the EOS data system, which will control the onboard instruments as well as process, distribute, and archive data. The question of access divides investigators, who want a system tailored to their needs, and government officials, who also want it available to the public. The revamped system is far less cumbersome than the original concept—it was “monstrously overgrown,” says Mahlman—and gives investigators much greater control over distributing and even processing their data. But the need to keep it open for public use rankles many. “You need different systems, one for gunslinger users and one that is friendly to high schoolers,” says Mahlman, who says he resigned as advisory chair 3 years ago out of frustration with NASA's pace in revamping EOS. Last year Goldin, under pressure from Senator Barbara Mikulski (D-MD), vetoed further attempts to simplify the system and restrict access.

    Scientists also chafe at the fact that well over two-thirds of EOS funding has been for hardware and feel that research has been given short shrift. “Is that cost effective and balanced?” asks Mahlman. Richard Somerville, a climatologist at the Scripps Institution of Oceanography in La Jolla, California, and a member of the NASA advisory panel, agrees. “It's a pity NASA is pinched when it comes to supporting research. The scientific payoff from the data could be greater,” he adds, if more funds were devoted to research. But Asrar defends the allocation. More than 20% of his office's $1.4 billion has been set aside for data analysis, he says, a figure that he hopes will rise to 30% within a few years. When funding from other agencies such as the National Science Foundation is taken into account, he notes, “I think this issue goes away.”

    From the beginning, however, some scientists have wondered whether the huge investment in EOS will yield equally huge results. Bras is upbeat, but some scientists say the program is hobbled by the fact that it has served more as a wish list for researchers than an exercise to answer specific scientific questions. As a result, says Richard Goody, a Harvard planetary physicist who was involved in the early planning, “EOS will be helpful, and we will learn something from it, but I don't think it moves us to an objective of making predictions from climate models more believable.”

    Those debates are likely to remain contentious for years. But next week they will be set aside as researchers hold their breath and hope for a successful launch. “The stakes are very high,” says Asrar. “The hopes and dreams of a large segment of the community are tied to this.”

  14. EUROPEAN UNION

    Research Chief Wants to Make Science Matter

    1. Robert Koenig

    Philippe Busquin wants science at the heart of EU decision-making, but Brussels bureaucracy doesn't give him much room to maneuver

    Brussels—Philippe Busquin, the European Union's new research chief, has a tough job ahead of him. For 2 decades, the EU has been pouring billions of euros into cross-border research projects, yet the community of European scientists remains fragmented and insular compared with its counterpart across the Atlantic. Research funding, on average, lags behind that of the United States and Japan; many postdocs would rather look for positions in North America than in another European nation; and because national governments jealously guard their right to make most of the big research funding decisions, the EU's research directorate has had a hard time bringing much cohesion to the European scene.

    Busquin would like to change all that. He wants to encourage more mobility and cooperation among Europe's top researchers, overhaul some of the EU's own research labs and big-ticket science programs, and bring the directorate's scientific expertise to the heart of EU decision-making. “We must give a new dynamic” to European research, Busquin told Science in an interview last month. “There is a wide range of issues in which science should have a major input” in Europe, he said, but boosting that input will require more coordination among the science policies of the member states and other European organizations.

    Can Busquin pull it off? In theory, he is well placed to put his stamp on European science. As one of 19 commissioners, he holds a position in the EU executive roughly equivalent to that of a Cabinet member in a national government. And as head of the EU's Framework 5 program, he commands a budget of nearly $4 billion a year—more than most research managers, such as the heads of the U.K. research councils or France's CNRS, have to play with. But even that amount represents less than 4% of Europe's total research spending, and it comes with some heavy political baggage. In general, the research ministries of France, the United Kingdom, Germany, and other major EU nations distrust the Brussels bureaucracy and keep a tight grip on decisions about “big-science” issues. Europe's major research facilities—including the CERN particle physics lab, the European Molecular Biology Laboratory, and the European Synchrotron Radiation Facility—are governed by direct treaties among participating nations rather than by the EU.

    Busquin, 58, who began his career as a physics lecturer, has been a politician for the past 2 decades. The political skills he honed as leader of the Socialist Party in Belgium's French-speaking region clearly will be needed in his new job.

    He got off to a shaky start during confirmation hearings in September, when conservative members of the European Parliament threatened to scuttle the entire slate of new commissioners because they questioned whether Busquin—in the wake of the cronyism charges that plagued his predecessor, Edith Cresson—might be compromised by past financial scandals involving his Socialist Party in Belgium. But since then, Busquin has tried to build rapport with both the Parliament and his directorate. Unlike Cresson, whose main office was in the commission headquarters, Busquin has moved into the research directorate building. And the generally friendly reception he got last month at his follow-up appearance before the Parliament's research committee contrasted starkly with the harsh criticism he endured at his stormy hearing in September.

    At last month's hearing, Busquin outlined his priorities and promised to deliver a policy paper in January on his ideas for a “European research area”—an effort to make European research more cohesive, focused, mobile, and multilateral, as well as more open to women and young scientists. He hopes the document will spark a debate among European scientists, political leaders, and industry.

    One issue that Busquin must face is the increasingly vocal complaints about the EU's flagship Framework 5 program, which funds cross-border projects in an array of fields. The Framework programs are popular among many researchers—especially those in the poorer EU member states, such as the Irish Republic, Portugal, and Greece—and the program is now open to scientists in 10 central and eastern European countries being groomed for eventual EU membership. This is increasing the competition for funds: In recent months, the research directorate has received more than 10,000 applications for Framework 5's first round of grants, and it will be able to fund less than 25% of them. “Our first impression is that [Framework 5] has been well received,” says Busquin. But many European scientists have criticized Framework for being excessively bureaucratic, and Busquin said the research directorate “needs to take such complaints seriously,” adding: “We want to become more cost-effective, but we also want high-quality research. It's not always an easy equation to solve.”

    Moving toward a solution will require some tougher policing of Framework 5. EU auditors last month criticized what they called “inaccurate declarations” and “insufficient controls” of EU research spending during Framework 4. In addition to such legacies, Busquin still has to deal with cantankerous national governments. European biologists recently attacked Framework 5 for withdrawing infrastructure funding from several life sciences facilities, including the European Mouse Mutant Archive (EMMA) near Rome. EMMA was set up using seed money provided by Framework 4, but member states—still determined to keep international facilities under their own control—decided earlier this year that Framework 5 should not pay for the infrastructure of facilities. That left EMMA starved of funding. Powerless to counter that decision, Busquin wants to “catalyze” efforts to help raise money for EMMA—for example, by arranging for funding by national research councils or outside scientific groups. He brought up EMMA at a 2 December meeting of European research ministers as an argument for the need to create more synergy among research efforts in Europe. An aide to Busquin told Science that, in the long term, Busquin “believes we need to rethink the concept of partnership between the commission and research infrastructures which are of European interest.”

    Also on Busquin's agenda is what to do with the EU's Joint Research Center (JRC). What began 40 years ago as a nuclear research facility has since evolved into a sprawling collection of eight institutes at several sites across the EU, covering everything from consumer protection to space science applications, at an annual cost of more than $1 billion. The European Parliament has criticized the JRC for being unfocused and for duplicating the work of national agencies, but Busquin thinks the center can play an important role in supporting EU policy-making. “We have to open up this [JRC] debate to include inputs from many others,” he said. Busquin recently formed a group of scientists, European parliamentarians, and industry experts to scrutinize the JRC. But although Busquin and the commission have some leeway about the JRC's structure and its use of funds, the wider policy decisions about its mission are made at the political level by European leaders in consultation with the Parliament.

    One area where Busquin wants EU facilities to play a larger role is that of food safety, following recent heated debates over beef infected with bovine spongiform encephalopathy, dioxin-laced poultry, and genetically modified crops. Commission President Romano Prodi has called for the creation of an EU-wide food safety agency. Busquin said he favors the concept, but he cautioned that the creation of a pan-European agency will face the same problem that hampers so many EU initiatives: how to bring coordination without treading on the toes of national governments, which have their own safety agencies and laws. Although food safety is only partially within his purview, Busquin believes the JRC could play a role in this area. He favors the creation of “a network of approved laboratories, so that we could have some degree of unity on how to tackle these food-safety issues, how to carry out the checks and controls that are needed.”

    As Europe enters the new millennium, Busquin says he wants science to be on the cutting edge of EU policy-making and its enlargement into central and eastern Europe: “Scientists and the research community have an essential role to play in a democratic world that is changing at a rapid pace.”

  15. ENVIRONMENTAL SCIENCE

    Biocomplexity Blooms in NSF's Research Garden

    1. Jeffrey Mervis

    Director Rita Colwell hopes to persuade both politicians and scientists that a hard-to-define term is the hottest research game in town and a key to understanding the world

    Biocomplexity. It's been a buzzword for Rita Colwell since she became director of the National Science Foundation (NSF) 16 months ago. Now, the nebulous concept is gradually taking on form and substance as the centerpiece of an expanding environmental research portfolio at the $4 billion agency.

    On 3 December NSF spelled out the rules for a $50 million special competition for research on the topic—the second since Colwell took office (NSF 00-22). Its call for interdisciplinary proposals “to better understand and model complexity in biological, physical, and social systems” comes as NSF officials are lobbying hard for $30 million in the president's 2001 budget request as a down payment on a $100 million National Ecological Observatory Network (NEON), a collection of outposts that would provide researchers with the capacity to do high-tech fieldwork on biocomplexity. Both initiatives are consistent with an upcoming report by NSF's governing body, the National Science Board (Science, 6 August, p. 816), that recommends that the agency nearly triple its environmental spending, from $600 million to $1.6 billion a year, over the next 5 years. And more than money is involved. Next month, paleoceanographer Margaret Leinen of the University of Rhode Island (URI) takes over as NSF's first environmental “czar,” with responsibility for integrating these new programs into a portfolio of environmental research that ranges from understanding magnetic storms in space to exploring life in deep-sea vents (see sidebar).

    “Biocomplexity is a multidisciplinary approach to understanding our world's environment,” Colwell told a congressional panel earlier this year. “For generations, scientists have studied parts of our environmental system—individual species and habitats—in isolation. Now it is time for a better understanding of how those parts function together, as a whole.”

    Even NSF staff involved in some of the biocomplexity initiatives confess that they don't yet have their arms around the concept. “It's still evolving,” says one program officer. “It's not at all clear how it will shake out.” The concept may not be a whole lot clearer to scientists in fields most likely to benefit from the jump in funding, but they like what they've heard so far. “We think it's great, and we're eager to learn more about it,” says Alan Covich, an ecologist at Colorado State University in Fort Collins and president-elect of the 66-society American Institute of Biological Sciences (AIBS), which has already chosen “From Biodiversity to Biocomplexity” as the theme for its 2001 annual meeting. “What's most exciting about biocomplexity,” says Covich, who studies the interaction of marine life, vegetation, and land use on a freshwater system in Puerto Rico, “is its emphasis on the importance of scale, from micro to macro, as well as its inclusion of the social and behavioral sciences—economics, anthropology, and geography—into studies of the ecosystem.”

    Within months of arriving at NSF in August 1998, Colwell had tapped into a discretionary fund to support a competition for research proposals on how microorganisms affect larger biological, chemical, geological, and even social systems. In late September NSF awarded $13.3 million—twice the amount originally planned—to five teams of scientists in what was called biocomplexity, phase I, with two more awards in the pipeline. “We were grinning for weeks,” recalls Caroline Bledsoe of the University of California, Davis, who received $4.8 million for a 5-year, three-institution effort to study the interactions of mycorrhizal fungi, plants, and soil in the transfer of carbon and nutrients. Other projects include examining how nitrogen fixation in the oceans affects global climate and analyzing how the genetic makeup of symbiotic bacteria affects the plant-feeding insects that act as their hosts.

    The overwhelming response to that hastily assembled special competition—NSF got 118 preproposals, and 34 groups were asked to submit a more detailed package—suggested that NSF had struck a chord in looking for ways to meld diverse approaches to studying large and varied ecosystems. “For 30 years we've taken a reductionist approach, and we have collected a lot of data,” says Mary Clutter, head of the biology directorate and coordinator for the initiative. “Now we want to synthesize it and come up with models and theories that explain much broader phenomena.”

    The just-announced second competition, drawing on the $50 million approved in October by Congress for the current 2000 fiscal year, encourages researchers to think in the broadest possible terms. In an attempt to be helpful, the solicitation describes the sorts of problems that researchers might want to attack: Do the physical arrangements of genes in our DNA and of neurons in our brains share patterns with the underground network of bacteria, fungi, and plant roots that nourish the planet, and can computers be used to explore such large-scale networks? How do current property laws and age-old cultural norms affect the rate of deforestation in tropical regions? And what is the mechanism by which human activities affect species diversity? “It's not more of the same thing,” says Amy Ward, director of the Center for Freshwater Studies at the University of Alabama, Tuscaloosa, and president of the Association of Ecosystem Research Centers. “It's a more organized effort to study complex problems, and it sets the stage for tackling more critical issues down the line.” Letters of intent are due 31 January, with full proposals a month later.

    NSF has also set aside $5 million for those interested in pursuing such topics but who aren't quite ready to take the plunge. The money would go for small grants to finance workshops, real or virtual, for like-minded scientists who otherwise might not interact. “We want people from different disciplines to address the big questions, but we know not everybody is ready to do that yet,” says Clutter.

    For those who are, and who need field data, there's NEON. Each of its stations would be equipped with such instruments as gene sequencers and sensor arrays, as well as sufficient computing power to handle petascale databases and to carry out complex simulations. The technological firepower would allow scientists to modify their collection efforts immediately rather than waiting for the next season. Clutter, whose directorate is overseeing the initiative, hopes to hold a competition next year to choose three sites, and to build as many as 10 over 3 years. Each station would cost $10 million to build and equip. “This is a new concept for field biology,” says NSF's Scott Collins, who is organizing meetings in January and March for scientists to hash out both the concept and the technological requirements for the chain of observatories. “They would be like telescopes, with a 30-year life-span, and together they would give us the capability to do high-tech, long-term experiments on the scale of up to an entire continent.”

    The biggest unanswered question for NSF administrators is how the biocomplexity initiative and its relatives, including NEON, fit into existing environmental research programs at the agency. A few years ago, for example, NSF launched a small interdisciplinary program, Life and Earth Environments, that explores the interaction of biological and physical systems. And for the past 20 years NSF has supported a network of Long-Term Ecological Research (LTER) sites that carry out fieldwork in a variety of ecosystems, the kind of work the new NEON stations would build on. Indeed, LTER sites will be eligible for NEON awards.

    The next step for NSF officials is to figure out how to weave everything together in a format that's both intellectually rigorous and politically compelling. “Biocomplexity provides the spark that can lift the entire [environmental] portfolio,” says Marge Cavanaugh, a chemist who heads an internal panel looking at how to implement the science board's recommendations. “It's an opportunity to take a conscious look at whole systems, including the necessary technology and human interactions needed to understand them.”

    That, in large part, will be the job of URI's Leinen. And Colwell's political success to date gives her a head start. For example, NSF's biocomplexity initiative is faring better in Congress than a new and related interagency effort called Integrated Science for Ecosystem Challenges (ISEC), for which the Administration sought $96 million to explore, monitor, and combat such environmental assaults as invasive species, eutrophication, and habitat destruction. NSF received its full $50 million request for biocomplexity, and $8 million for its slice of ISEC, by arguing that its role is to support the basic research that precedes and underlies any response to environmental assaults. In contrast, Congress cut by nearly 80% the proposed budgets for the other five, more mission-oriented agencies—including the geological survey, the agriculture department, and the forest service. “Biocomplexity is an NSF initiative, to fund basic environmental research,” Colwell says. And NSF's contribution to ISEC, she adds, is to provide the knowledge that forms a basis for action under ISEC.

    In fact, NSF's success in promoting its environmental agenda may make the other agencies a bit jealous. “We're extremely pleased that NSF is faring so well, and that university-based research is being strengthened,” says the Interior Department's Mark Schaefer, who oversees ISEC. “But the applied programs need resources, too.”

    A few days before Colwell was confirmed as NSF director, she told an AIBS audience that humanity's goal for the 21st century ought to be “to understand, and learn to keep in balance, the biocomplexity of all of Earth's ecosystems … and to tease out from those subtle but sophisticated interactions the principles of sustainability.” Since then, she's made it clear that she wants NSF to play an important role in that quest.

  16. ENVIRONMENTAL SCIENCE

    NSF's New Czar for Environment

    1. Jeffrey Mervis

    Paleoceanographer Margaret Leinen wears many hats at the University of Rhode Island (URI)—as dean of the graduate school of oceanography, vice provost for marine affairs, and interim dean of the school's college of environmental and life sciences. That broad purview will stand her in good stead when she comes to Washington, D.C., next month to be the National Science Foundation's (NSF's) first environment czar, responsible for coordinating environmental science and engineering programs. Leinen, 53, will also be replacing Robert Corell as head of NSF's $500 million geosciences directorate, one of seven research directorates at the foundation and a major player in many of NSF's environmental activities.

    Leinen, who studied sedimentation in the ancient oceans and its relationship to climate change before becoming an administrator, says she relishes the chance to run the geosciences program. But it's the novel role of environmental czar that led her to take leave from URI, where she's spent her entire professional life, and accept an invitation from NSF director Rita Colwell to join the federal government. “Rita is obviously committed to closer collaboration between the social and the natural sciences,” says Leinen, and so is she. In fact, Leinen said that a chance to wear the environmental hat helped to seal the deal after she topped NSF's list during an extended search to replace Corell, who has led the directorate since 1987.

    Leinen's first challenge will be to define her role in such existing activities as NSF's new biocomplexity initiative (see main text), now run by biology head Mary Clutter, and the U.S. Global Change Research Program, a $1.6-billion-a-year interagency colossus that Corell has chaired since its inception in 1990. It may be hard to avoid stepping on toes, but her colleagues at URI say she should be able to avoid major collisions. “She gets along with everybody, and she's very personable,” says John Knauss, a professor emeritus at URI and former dean who has known Leinen since she was a graduate student. “She knows how to hit the right notes in managing people.” Adds Ted Moore of the University of Washington, Seattle, a former mentor and colleague, “I think that geosciences at NSF will be in good hands with Margaret at the helm.”

    Although reluctant to spell out her priorities, Leinen says she believes environmental scientists need to play a larger role in educating students and the general public about how science and technology affect their lives. “We especially need to talk to minority communities about the need for them to be more active,” she says. “We have not done a good job of informing them about the [environmental] issues they are facing.” It's clear that Leinen, like her boss, is ready to think broadly about environmental science and NSF's role in advancing it.
