News this Week

Science  22 Aug 2008:
Vol. 321, Issue 5892, pp. 1026

    FBI Discusses Microbial Forensics--but Key Questions Remain Unanswered

    Yudhijit Bhattacharjee and Martin Enserink

    Environmental workers donned protective gear on 29 November 2001 before attempting to clean up the Hart Senate Office Building after the anthrax attacks.


    WASHINGTON, D.C.—Facing growing public skepticism, impatient politicians, and a blogosphere rife with conspiracy theories, the Federal Bureau of Investigation (FBI) on Monday sought to bolster its case that Army scientist Bruce Ivins was the perpetrator of the 2001 anthrax attacks by extensively discussing the scientific evidence. Backed by six outside researchers who lent a hand in the investigation, FBI officials explained during two press briefings how they linked spores from the envelopes sent to Congress and the media back to a flask at Ivins's lab at the U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID). Although the agents did add a few new details to the picture that has emerged over the past 3 weeks, several scientists say that many questions still remain unanswered (see sidebar).

    The briefings confirmed the scientific process that sources close to the investigation, as well as independent microbiologists who analyzed a key court document, had described before (Science, 15 August, p. 898), a process that relied on whole-genome sequencing to find four mutations unique to the spores used in the attacks. The bureau also hammered home that, contrary to widespread reports, no special additives, such as silica, were added to the spore preparation, and that Ivins—who killed himself on 29 July—would have been able to produce the letters on his own 7 years ago using standard lab equipment.

    To reinforce the scientific message, the FBI took the unusual step of organizing two briefings, one for reporters from scientific journals, including Science, so that it had ample time to delve into the scientific nitty-gritty, and the other for the general press. It also brought in microbiology heavyweights such as Rita Colwell, who helped fund the sequencing of the attack strain in 2001 while she was head of the National Science Foundation.

    Dangerous mail.

    Senator Tom Daschle's office received one of the letters tainted with the deadly anthrax spores, later found to be from the widely used Ames strain (above).


    More answers may come out over the next couple of months, because the bureau has given its scientific collaborators permission to discuss the case and publish their findings. Claire Fraser-Liggett, the former head of The Institute for Genomic Research (TIGR) who is now at the University of Maryland School of Medicine in Baltimore, says she and her colleagues expect to have a draft manuscript about their piece of the puzzle ready for submission in a month's time. “The numerous technical reports we have had to produce for the FBI over the years are making manuscript-writing that much easier,” she says.

    A key moment in the investigation came when TIGR researchers sequenced the anthrax sample taken from the spinal fluid of Robert Stevens, the first victim of the anthrax letter attacks that ultimately killed five people and sickened 17. When they compared that strain with the first anthrax strain ever sequenced—a project finished about the time of the attacks—they found a few small differences between the two, as they reported in Science (14 June 2002, p. 2028). Because Bacillus anthracis is extremely homogeneous genetically and both isolates came from the Ames strain, researchers “had no idea” that they could reliably pinpoint a few changes in its genome, says Jacques Ravel, a former TIGR researcher now at the University of Maryland School of Medicine. The finding gave them confidence that they might be able to differentiate the anthrax samples stored at dozens of labs, says Fraser-Liggett—and finger the one involved in the letters.

    The FBI confirmed on Monday that a preliminary analysis at the U.S. Centers for Disease Control and Prevention showed that the powder in each of the four letters was not a uniform population of cells but a mix that, when grown in a petri dish, yielded some colonies that were different in texture, color, and size. Paul Keim's group at Northern Arizona University in Flagstaff extracted DNA from these variants and supplied it to TIGR, where 12 DNA samples were fully sequenced. Four mutations—all of them insertions or deletions of a DNA fragment, or “indels”—were then selected, and FBI and independent scientists developed assays to look for them in a repository of more than 1000 samples of the Ames strain that the FBI had collected from labs in the United States, Sweden, the United Kingdom, and Canada. The samples were tested at about a dozen labs, including TIGR.

    The FBI disclosed earlier that only eight samples had all four mutations; on Monday, it said that all but one of these came from USAMRIID. And all eight could be traced to RMR-1029—the flask of spores under Ivins's charge.
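
    In outline, that screening step amounts to checking each archived sample for the full four-marker profile found in the attack material. The short Python sketch below illustrates the logic only; the marker names, sample identifiers, and assay calls are invented for illustration and are not the FBI's actual data.

```python
# Illustrative sketch only: hypothetical samples and assay calls, not the FBI's data.
# Each repository sample is scored for presence or absence of the four indel markers
# identified in the attack spores; only samples positive for all four are flagged.

ATTACK_PROFILE = {"indel_A", "indel_B", "indel_C", "indel_D"}   # hypothetical marker names

# Hypothetical assay results: sample ID -> set of markers detected
repository = {
    "FLASK-RMR-1029": {"indel_A", "indel_B", "indel_C", "indel_D"},
    "USAMRIID-074":   {"indel_A", "indel_B", "indel_C", "indel_D"},
    "LAB-X-012":      {"indel_A", "indel_C"},
    "LAB-Y-331":      set(),
}

def matches_attack_profile(markers_detected, profile=ATTACK_PROFILE):
    """A sample matches only if every marker in the attack profile was detected."""
    return profile.issubset(markers_detected)

flagged = [sid for sid, markers in repository.items() if matches_attack_profile(markers)]
print(f"{len(flagged)} of {len(repository)} samples carry all four markers: {flagged}")
```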

    That is as far as the science took them, the FBI conceded; conventional detective work—such as checking lab notebooks and shipment records—helped rule out everyone but Ivins who had access to the spores, says Vahid Majidi, head of the agency's Weapons of Mass Destruction Directorate. He declined to give details. “I'm asking you not to second-guess our investigative approach,” he said.

    An analysis by materials researcher Joseph Michael at Sandia National Laboratories in Albuquerque, New Mexico, convinced the FBI that no silica or other chemicals had been added to the anthrax in the letters, as an earlier analysis by the Armed Forces Institute of Pathology (AFIP) had indicated. Transmission electron microscopy by Michael and colleagues revealed that the silicon AFIP researchers had detected in the samples was contained inside the spores—a natural occurrence documented in previous research—and not in a coating intended to make the anthrax disperse more easily.

    Many experts had believed that only special coatings or electrostatic charges—which would require sophisticated chemical and physical expertise—could explain why the powder in the Senate letters seemed to float so easily (Science, 28 November 2003, p. 1492). If the bureau is right, “that makes it more frightening in a way,” says bioterrorism expert Michael Osterholm of the University of Minnesota, Minneapolis, as it suggests that people with no special skills in those fields could make an efficient, deadly powder.

    Inside job?

    The FBI has alleged that the anthrax letters were sent by a scientist working at a high-security lab at the U.S. Army Medical Research Institute of Infectious Diseases.


    During the general press conference, Majidi admitted to a misstep in the investigation. In 2002, the FBI destroyed an anthrax sample that Ivins had submitted from his lab in February because it had not been prepared in accordance with a court-directed protocol. Keim held onto the mirror sample sent to his lab until 2006, when investigators realized it could be probed for the four mutations. (By then, investigators had already confiscated the RMR-1029 flask from Ivins.) The earlier sample was eventually shown to have all four mutations, linking it to the envelopes; but samples that Ivins gave to agents a few months later, in April 2002, did not, leading the FBI to suspect that he was trying to mislead them the second time around.

    Throughout the process, the agency enlisted dozens of scientists to vet each step in the investigation and the results of the analyses to try to ensure that they would stand up to scrutiny in a court of law, says D. Christian Hassell, director of the FBI Laboratory. Groups of scientists were invited for “red-team reviews” to scrutinize the evidence, Hassell said.

    Scientists involved in the case say they are relieved that after many years, they can now discuss and publish the evidence. “Suddenly, it's very relaxed,” says Keim, who said he was handed a document releasing him from the nondisclosure agreement an hour before he took part in the first press conference. Adds virologist Thomas Geisbert of Boston University, “All of us who have handled the evidence have asked the FBI, and pleaded with them, to allow us to release it.”

    Because the evidence is unlikely to be challenged in court—the FBI expects to close the case soon—there should be a “court of science” instead, convened by the National Academy of Sciences or another independent body, suggests Osterholm. That would help alleviate the public mistrust of the official investigation and counteract conspiracy theories, like those surrounding the assassination of President John F. Kennedy, he says. But Majidi seemed resigned to the idea that the doubts will never go away entirely: “There's always going to be a spore on a grassy knoll.”


    Six Anthrax Science Questions the FBI Has Yet to Answer

    Yudhijit Bhattacharjee and Martin Enserink
    • What were the four mutations the FBI says it used to link the anthrax in the envelopes to Bruce Ivins at USAMRIID?

    • What are the odds of a false positive—that is, the odds that the spore populations in Ivins's flask RMR-1029 and in the envelopes weren't related but shared the same four mutations by chance?

    • Eight samples had anthrax with all four mutations; one of those came from a lab other than USAMRIID. On what basis was this lab ruled out as the origin of the letters?

    • How did the FBI rule out the possibility that others at USAMRIID with access to Ivins's lab prepared the envelopes?

    • How exactly did Ivins, if he was the perpetrator, produce an easily dispersible powder from his anthrax culture?

    • What led the FBI to suspect Steven Hatfill in the earlier years of the investigation?


    Pumping Up the Tibetan Plateau From the Far Pacific Ocean

    Richard A. Kerr

    Explaining how an area the size of Alaska got to be higher on average than the highest peak in the contiguous United States doesn't seem all that difficult: Just blame India. The roving subcontinent plowed into Eurasia beginning 50 million years ago and hasn't stopped yet. When it comes to working out the details, however, the Tibetan question remains the most contentious in tectonics. “There's a lot of gabbing, and we're not much further along than we were 20 years ago,” says geochemist T. Mark Harrison of the University of California, Los Angeles (UCLA).

    The problem is that researchers can't see much of what's going on beneath the plateau. Is the underlying rock strong and rigid or flowing like molasses? On page 1054, geoscientists Leigh Royden, B. Clark Burchfiel, and Robert van der Hilst of the Massachusetts Institute of Technology (MIT) in Cambridge argue that rock has flowed west to east beneath the plateau to inflate its eastern side and that the flow has been throttled by tectonic doings as far as thousands of kilometers away.

    There is some agreement about how the Tibetan Plateau story begins, at least. In the western portion of the collision zone—where India smashed into the rigid and resisting block of the Tarim Basin—the colliding continents squeezed the intervening rock into the great Karakoram Range. To the east, with hot, weak rock and no rigid backstop in the way to strengthen the continents' grip, the collision sent great chunks of rock eastward. Piled 5 kilometers high to the west, crustal rock weighed heavily on the rock beneath, driving fragments toward southeast China and Indonesia without raising much of a plateau in the east.

    This eastward extrusion, the MIT group is now pointing out, coincided with stirrings of the plates of the western Pacific. There, ocean plates were diving into deep-sea trenches and heading under Asia. The trenches, meanwhile, were rapidly retreating from the continent. This trench retreat made room for the crustal rock being extruded eastward and southward from Tibet, the MIT group reasons. When the trench retreat ground to a halt between 30 million and 20 million years ago, the researchers argue, the extruded material began to pile up, thickening the crust faster than before and thus pumping up the plateau.

    Another, much more local gating of flow explains why the central plateau thrust upward about 10 million years ago, the group suggests. Normally, such a rise would leave deformed rock and other signs that the crust had been compressed and squeezed. No such compressional markings have been found in the east. Instead, north-south faults in the eastern plateau show that the crust was actually stretching east and west. To explain the paradox, the MIT group posits that a “dam” of strong, rigid rock beneath the plateau busted about 15 million years ago. Deep, weak crustal rock gushed eastward, stretching the central plateau by its departure and pushing up the eastern plateau. Traces of that rock monsoon may have turned up in seismic images of an apparently weak crustal layer deep beneath the eastern plateau, which van der Hilst and colleagues published earlier this year in Geophysical Journal International.

    Not everyone agrees that crustal flow was crucial. In the May issue of Geology, seismologist Paul Silver of the Carnegie Institution of Washington's Department of Terrestrial Magnetism in Washington, D.C., and colleagues presented seismic probings of the “fabric” of the crust and deeper mantle rock of the plateau. The strong similarity of crustal and mantle fabrics “disallows the lower crustal flow model” of the MIT group, Silver says, because such flow would have disconnected the crust from the mantle, leading to different fabrics. As to the apparent absence of signs of squeezing in the eastern plateau, the heavily forested eastern area “is a good place for panda bears but a bad place for geologists to make geologic maps,” writes geoscientist An Yin of UCLA in an e-mail.

    Informative disaster.

    The Sichuan quake holds clues to what has driven the rise of the Tibetan Plateau.


    May's devastating earthquake in the Sichuan region of China on the eastern edge of the plateau might eventually help resolve the Tibetan question. Burchfiel, Royden, van der Hilst, and colleagues argue in the July issue of GSA Today that the quake mainly reflects slow uplift, consistent with crustal flow. But so far, others see dramatic evidence of severe squeezing and crustal deformation. And how the deeper crust and mantle slowly recover from the shock of the quake could reveal just how weak and prone to flow the deeper rock of the plateau actually is.


    'Simple' Animal's Genome Proves Unexpectedly Complex

    Elizabeth Pennisi

    Aptly named “sticky hairy plate,” Trichoplax adhaerens barely qualifies as an animal. About 1 millimeter long and covered with cilia, this flat marine organism lacks a stomach, muscles, nerves, and gonads, even a head. It glides along like an amoeba, its lower layer of cells releasing enzymes that digest algae beneath its ever-changing body, and it reproduces by splitting or budding off progeny. Yet this animal's genome looks surprisingly like ours, says Daniel Rokhsar, an evolutionary biologist at the University of California, Berkeley (UCB) and the U.S. Department of Energy Joint Genome Institute in Walnut Creek, California. Its 98 million DNA base pairs include many of the genes responsible for guiding the development of other animals' complex shapes and organs, he and his colleagues report in the 21 August issue of Nature.

    Simple—or simplified?

    It's a puzzle why Trichoplax, a seemingly primitive animal, has such a complex genome.


    Biologists had once assumed that complicated body plans and complex genomes went hand in hand. But T. adhaerens's genome, following on the heels of the discovery of a similarly sophisticated genome in a sea anemone (Science, 6 July 2007, p. 86), “highlights a disconnect between molecular and morphological complexity,” says Mark Martindale, an experimental embryologist at the University of Hawaii, Honolulu. Adds Casey Dunn, an evolutionary biologist at Brown University, “It is now completely clear that genomic complexity was present very early on” in animal evolution.

    Ever since German zoologist Franz Eilhard Schulze first found Trichoplax more than a century ago in a saltwater aquarium, this disc-shaped animal has stirred debate. It has just four apparent types of cells, prompting Schulze and others to consider it a holdover from the earliest animals. They eventually assigned it to its own phylum, Placozoa.

    But not everyone agrees which branch of the animal tree of life is oldest: sponges, comb jellies, or placozoans. And a few researchers have dismissively argued that placozoans are just larvae of cnidarians—jellyfish, sea anemones, and the like—or else a streamlined version of a cnidarian ancestor.

    Rokhsar, his graduate student Mansi Srivastava, and their colleagues sequenced a Trichoplax from the Red Sea, finding an estimated 11,514 protein-coding genes. After comparing the sequences of 104 Trichoplax genes with their counterparts in other organisms, they concluded that placozoans aren't the oldest animals; they branched off after sponges but before cnidaria. Placing Trichoplax on the tree “will now allow us to understand how to interpret its biology in the context of animals as a whole,” says Dunn.

    The sequence is also clarifying what ancient genomes looked like. Trichoplax genes carry about as many introns, the noncoding regions interspersed between the coding regions, as do the genes of vertebrates and the sea anemone. And many of the same genes were linked on the chromosomes of vertebrates, Trichoplax, and sea anemones, the researchers report. This was not the case with the fruit fly and nematode genomes, whose genes have fewer introns and have moved about quite a bit.

    Despite being developmentally simple—with no organs and few specialized cell types—the placozoan has counterparts of the transcription factors that more complex organisms need to make their many body parts and tissues. It also has genes for many of the proteins, such as membrane proteins, needed for specialized cells to coordinate their function. “Many genes viewed as having particular ‘functions’ in bilaterians or mammals turn out to have much deeper evolutionary history than expected, raising questions about why they evolved,” says Douglas Erwin, an evolutionary biologist at the Smithsonian National Museum of Natural History (NMNH) in Washington, D.C.

    Trichoplax could yet be more complex than observed, perhaps having subtle differences in cell types. Or, the amoeboid form may be just one phase of a complex life cycle that's still undiscovered, says Rokhsar.

    The surprises in the Trichoplax genome emphasize the importance of sequencing other early-arising species, such as comb jellies or different kinds of sponges, says evolutionary biologist Allen Collins of the National Marine Fisheries Service and NMNH. “The more taxa we fill in,” says Collins, “the clearer our picture will be for how the entire suite of these molecules evolved over the critical time early in metazoan history.”


    New Regulation Would Lessen Influence of Fish and Wildlife Experts

    Erik Stokstad

    For much of the past 35 years, the U.S. Endangered Species Act (ESA) has been at the center of some of the fiercest environmental battles in the United States. It has been the means by which tiny fish have held up big dams, helped bring iconic species such as the bald eagle back from the brink, and pitted environmentalists against loggers over the protection of old-growth forests in the Pacific Northwest. Now, with just 5 months left in office, the Bush Administration has proposed controversial rules that would exempt many projects from what the Administration says are unnecessary reviews by the U.S. Fish and Wildlife Service (FWS) and the National Marine Fisheries Service (NMFS). But the plan has left environmentalists sputtering. “This is a way to allow destructive projects to go forward without the check of Fish and Wildlife Service biologists,” says Defenders of Wildlife's Jamie Rappaport Clark, who headed the service during the Clinton Administration.

    In proposing the rule last week, the departments of the Interior and Commerce listed two main goals. The first is to prevent the ESA from being used to regulate carbon dioxide (CO2) emissions. That became a possibility after FWS listed the polar bear as a threatened species in May (Science, 23 May, p. 1000). The second is to reduce the number of so-called informal consultations, which have delayed many proposed projects for months or years, according to a 2004 report by the U.S. Government Accountability Office.

    Currently, when an agency is considering a proposed action (or giving a permit to a member of the public), it must first determine whether a listed species or its critical habitat might be affected. If so, then it must informally consult with biologists at the relevant service, generally FWS for terrestrial species or NMFS for marine organisms. More often than not, the agency thinks the project is unlikely to cause harm. The FWS or NMFS biologists usually agree, and the agency can proceed. But if agency or service biologists determine that harm is likely, a more involved, formal consultation takes place, and the service decides how the damage could be minimized or avoided.

    Under the proposed rule, agencies would have to consult the services only if indirect or direct effects of their actions are an “essential cause” of and a “significant contributor” to the likely harm. This would mean that the Department of Transportation, for example, would not have to consider the impact of greenhouse gas emissions from cars on polar bears or any other listed species. The reason? Emissions from any single highway do not make a significant contribution to the melting sea ice that harms the bears, argue officials at the departments of Interior and Commerce.

    Beyond CO2, the proposed changes are sweeping as well. Under the new rule, if the agencies determine that their projects are not likely to harm a species, they would not need to seek an expert opinion from the services at all. (If the agencies suspect harm to a species, however, they still must formally consult.) Officials at the departments of Interior and Commerce argue that agencies are “fully qualified” to decide on their own whether their projects will harm a species or its habitat.


    Under a proposed rule, some logging and other federal activities might not require a review of possible harm to endangered species, such as the marbled murrelet.


    Not so, say experts familiar with the ESA. Holly Doremus of the University of California, Davis, law school cites an analysis by FWS and NMFS, released in January, of similar regulations under the National Fire Plan, a federal program to reduce the risk of forest fires and restore burned ecosystems. Issued in 2003, these regulations allow the U.S. Forest Service and the Bureau of Land Management (BLM) to decide whether to consult about prescribed burning and other fire-related actions. NMFS found that the agencies failed to use the best available science in 10 out of 10 cases; FWS found other flaws in 25 out of 43 cases. “That's a very discouraging piece of evidence about how seriously the action agencies will take this job,” says Doremus. A BLM spokesperson says the agency now has more expertise and expects “improved outcomes” in the future. The Forest Service has also increased training.

    Although it's difficult to know exactly how well the existing system protects species, proponents say the process helps. “It's where most of the protection for endangered species occurs,” says Noah Greenwald of the Center for Biological Diversity, an advocacy group in Tucson, Arizona. Without a requirement to check in with the services, Clark warns, agencies will have an incentive to put their mission—such as flood control or energy projects—ahead of protecting endangered species. According to J. B. Ruhl of Florida State University's College of Law in Tallahassee, the new rules “surely could be manipulated and abused.” U.S. Department of the Interior spokesperson Chris Paolino calls those concerns unfounded. “All the penalties for harming an endangered species remain in effect,” he says. “It's still in an agency's best interest to err on the side of caution.”

    The future of the rule, which is open for public comment until 15 September, is uncertain. A spokesperson for Senator Barack Obama (IL), the presumptive Democratic nominee for president, told the Associated Press that Obama would toss out the rule if he's elected; Senator John McCain's (R-AZ) campaign has not commented. Meanwhile, Congress could prevent any Administration from spending funds to implement it; Barbara Boxer (D-CA), who chairs the Senate Environment and Public Works Committee and opposes the rule, plans to hold a hearing 24 September. And then there are the courts. “Our biggest hope is that the rule will not be finalized,” says Greenwald. “But if it is, it's likely we'll challenge it.”


    Departments Scramble to Find Math Education Faculty

    Jeffrey Mervis

    Kathryn Chval didn't have to apply for her position as a tenure-track assistant professor of math education at the University of Missouri, Columbia. And any letters of recommendation were an afterthought. Instead, after earning her Ph.D. in 2001 and joining the National Science Foundation (NSF) in suburban Washington, D.C., as a program manager, Chval simply put out the word in early 2003 that she was interested in a faculty job. She let one of the hottest job markets in academia do the rest.

    Within a few weeks, 10 universities had invited her for a visit; Missouri offered to fly her out the next day. Comparing her experiences at four campuses, Chval liked the “sense of community” at Missouri and felt that the department's interests complemented her strengths in teacher education, curriculum standards, and working with disadvantaged student populations. But her husband, a lawyer, also needed to find a job. No problem: A few days later, the state attorney general's office offered him a spot in its computer crime unit.

    Chval's no-hassle job search isn't unusual for those holding Ph.D.s in mathematics education. And although that may be good news for job-seekers, it's another impediment for universities trying to improve U.S. science and math education. Chval's colleague in the department of teaching, learning, and curriculum within Missouri's school of education, veteran mathematics educator Robert Reys, reports in the June/July Notices of the American Mathematical Society that 60% of 128 tenure-track academic jobs advertised last year in mathematics education went unfilled (see graphic). Previous surveys by Reys show that number has held steady for at least a decade. The reasons for the seller's market include a shortage of people entering the field, a growing demand by universities for their expertise as they become more involved in precollege education, a lack of consensus on how they should be trained, and a surfeit of other professional opportunities.

    For starters, U.S. universities aren't keeping up with the demand. They award an average of 93 doctorates in mathematics education each year—fewer than one-tenth the number of doctorates in mathematics itself. And Reys says that many of the math ed Ph.D.s go to people not actively looking for jobs because they are already working for school districts, state education agencies, test publishers, and the like. In addition, a majority of those hired last year were already at a university, meaning that their departure simply created another vacancy.

    Indeed, Reys found that the most common reason a university position wasn't filled was that the top candidate went elsewhere, part of a bidding war that Chval says has only gotten worse since she came to Missouri. “I continue to get calls,” says Chval, who has built up a productive research agenda and is quietly optimistic about her chances of winning tenure next spring. “I recently received four calls from one institution, including one from the dean explaining how well my research interests fit in with their program.”

    In demand.

    Math educators like Kathryn Chval are a prized commodity for many U.S. universities, which are having a hard time filling advertised positions.


    Department chairs say the second most common reason for not filling a position is the inability to find an “appropriate candidate.” That's code for the ongoing debate over the proper academic and professional qualifications for a faculty member who may be asked to teach math to undergraduates and education and math courses to graduate students, carry out education research, and train future teachers. Mathematics departments may require candidates to hold a Ph.D. in mathematics, meaning serious research in the discipline, while schools of education usually prefer someone who's been in the classroom and knows how the U.S. education system works. It's a rare individual who has both qualities and a rare institution that can strike the right balance. One department chair who responded to Reys's survey describes “… the third year of a failed search” that pitted mathematicians against math educators. Then he adds plaintively, “we may now lose the position to a pure math type.”

    Low salaries are a third reason for the dearth of qualified candidates, according to the survey, which cites a modal starting salary of $45,000 to $50,000. Chval says that she took a 40% cut in pay in moving from NSF, in suburban Virginia, to Columbia, but that the lower cost of living and other benefits of a small college town made the numbers work. Public school teachers with Ph.D.s often price themselves out of a job in higher education, notes Deborah Loewenberg Ball, dean of the school of education at the University of Michigan, Ann Arbor, a former elementary school teacher whose research has focused on improving elementary and middle school math instruction.

    The federal government is doing little to address directly the shortage of math education faculty members. Earlier this decade, NSF funded 16 so-called centers for learning and teaching that aimed to improve instruction from kindergarten through graduate school in the STEM (science, technology, engineering, and mathematics) disciplines. A partner in one of the centers, Missouri doubled the size of its math ed doctoral program, says Reys. But the centers are being phased out, and Reys says he expects the number of graduates—which hit a peak of six this year—to begin dropping in a few years.

    The demographics also work against any quick fixes. Just ask Chval, who was hired by the University of Illinois, Chicago, in 1989 to manage some of its NSF grants and who took 9 years to complete her Ph.D. there, taking one class a semester while raising a family and working full-time. And that's after having taught elementary school for 2 years. “It may have been free tuition, but my time sure wasn't,” she laughs.


    Turbulent Times for Climate Model

    Eli Kintisch

    Researchers are running out of time to finish updating an important U.S. climate change model that has been hamstrung by the budget woes of its home institution, the National Center for Atmospheric Research.


    Every June, U.S. climate scientists descend upon Breckenridge, Colorado, to kick the tires on the nation's foremost academic global climate model. Some years there is added pressure, as scientists try to tune up the Community Climate System Model (CCSM) for simulations that will feed into the next report of the Intergovernmental Panel on Climate Change (IPCC). This is one of those years, and scientists are more worried than usual.

    The question is whether they can meet a 1 October deadline for completing a critical part of their increasingly complex simulation of the interplay of Earth's atmosphere, oceans, land, and ice. “We're all very nervous,” says atmospheric modeler Philip Rasch, who works remotely for the National Center for Atmospheric Research (NCAR) in nearby Boulder and who oversees the atmospheric component of the model. A big reason for the concern is the condition of the center, which hosts and manages CCSM.

    A towering challenge.

    The majestic peaks outside NCAR's windows contrast with pancake-high budget increases for the center from NSF.


    In 2004, NCAR's then-director, Timothy Killeen, launched a major restructuring that included expanding the lab, banking on a 2002 congressional promise of a 5-year doubling of the budget of its main funder, the National Science Foundation (NSF). But Congress failed to keep its promise, and NSF's contribution to NCAR, instead of rising by double-digit percentages, has grown by only 2.6% annually in the past 5 years. The resulting belt-tightening has meant pink slips for 55 employees since 2003 (out of a workforce that has averaged 800 since then) and not replacing 77 others who retired or left.

    Those losses have affected CCSM. In the past 2 years, seven accomplished climate scientists among 50 researchers heavily involved in CCSM have left or announced plans to leave NCAR, including Rasch, a 27-year veteran who begins a new job this fall at Pacific Northwest National Laboratory in Richland, Washington. None has been replaced, although six scientists with less experience have been brought on since 2006 to bolster the effort.

    Some climate scientists say that CCSM should have been better protected from the budget turmoil. “This hub of the nation's climate strategy has apparently not received the priority it deserves and needs,” wrote members of the model's independent scientific advisory board on 8 July in an unsolicited letter to Eric Barron, who last month succeeded Killeen. Although computers are critical for climate simulation, they say, in the end it's NCAR's staff who must incorporate thousands of complex elements into a code that simulates everything from hurricanes to droughts to ocean currents.

    Any erosion of CCSM's projected capabilities threatens what modeler David Randall of Colorado State University in Fort Collins calls “the closest thing we have to a national model.” What sets CCSM apart from rival U.S. models at NASA and the National Oceanic and Atmospheric Administration is its widespread use by academic researchers, who also build it in partnership with NCAR. So whereas the other models rely on the expertise of teams of federal experts, CCSM's health reflects the state of overall U.S. climate research.

    Although IPCC won't issue its next report until 2013, it has asked for data in 2011 from roughly two dozen models scattered around the world. Working backward, CCSM scientists gave themselves the October deadline to finalize the atmosphere, the central element of the million-line code, as well as the other segments. NCAR's deadline for connecting the pieces is 1 January 2009.

    The changes will fix some flaws in the previous version of CCSM and add new features. In particular, scientists want to make tropical temperature patterns more realistic, depict ice sheets, clouds, and cycles such as El Niño more accurately, and better simulate the turbulent movement of air between the ground and an altitude of 1 km. IPCC scientists would also like models to incorporate an active carbon cycle that simulates how Earth's biological life—say, algae or swamps—shapes the biosphere.

    Will NCAR come through? “CCSM is in danger of not being able to make scientifically credible contributions to [IPCC] and the climate science community,” the board wrote in its July letter. Barron disagrees, saying CCSM will remain one of the world's top models. But he acknowledges that fewer bodies will mean “not being able to do as much.”

    Gambling on growth

    Nestled at the foot of the Rocky Mountains in Boulder, NCAR was established in 1960 with NSF funding to advance climate and weather science. Its researchers have developed some of the world's best tools for predicting storms, droughts, and rising global temperatures, built on work by meteorologists, physicists, and modelers alike. In addition, NCAR provides planes, balloons, and computers to academic scientists across the country and around the world. NSF still supplies about three-fifths of its budget, $88 million of a total $149 million, including money for operations, buildings and equipment, and major facilities. The rest comes from competitive grants awarded by other government agencies and industry.

    The precursor to CCSM was developed in 1983 at NCAR, and the model remains a unique partnership between academic and government scientists. Its paleoclimate runs and future projections have been the basis for hundreds of studies referenced by IPCC. The model is run out of the Earth and Sun Systems Laboratory (ESSL), the largest of five so-called laboratories at NCAR.

    Along with an equally renowned short-term weather model, CCSM helped establish NCAR's reputation as the go-to resource for academic atmospheric research. But Killeen, who became director in 2000, wanted NCAR to do more, including increasingly detailed forecasts of the impacts of climate change and interdisciplinary studies on weather. So in 2004, he regrouped existing divisions into ESSL and labs for Earth observations, computing, and airborne-weather projects. He created a fifth lab to respond to growing interest in the societal impacts of climate change. Within the labs, Killeen also set up institutes devoted to interdisciplinary work and to the application of modeling and mathematical methods in geoscience. “It was a bold new initiative,” says Richard Anthes, president of the University Corporation for Atmospheric Research in Boulder, which operates NCAR for NSF.

    Doing more with less.

    The next IPCC report poses a bigger challenge for NCAR's atmospheric modelers, who have fewer senior scientists (names in bold) on hand.


    Killeen hoped that NSF would finance the expansion as its overall budget grew. NSF's contribution to NCAR did rise by 19% between 2001 and 2004, but in the past 5 years it has increased by only 10%. That below-inflationary rise has triggered “chronic wasting disease” at NCAR, says Anthes. It also spawned fears among some scientists about the cost of the new bureaucracy. Managers estimate that the reorganization has added $5 million in staffing and other administrative costs over 4 years. “Shouldn't we really be about putting the money instead into scientific programs?” NCAR veteran scientist Peter Gilman recalls asking an assembly at NCAR in 2004. But without a growing contribution from NSF, Killeen was forced to ask NCAR managers to tighten their belts, including dipping into research funds to meet other expenses.

    The modeling effort has also been affected by a program that Killeen began in 2001 that pays half the salaries of new hires for 2 years. ESSL used the optional funds to hire four young scientists in the climate division between 2001 and 2007, says William Large, who heads ESSL's climate division, and managers also hired five young scientists through the normal mechanism. But without a rising budget, the lab couldn't afford to replace senior modeling polymath Byron Boville, who died in 2006, or Jeffrey Kiehl, who moved from atmospheric to paleoclimate studies within ESSL. Kiehl called the hirings “a gamble” that ESSL lost.

    Other parts of NCAR have also suffered from the budget shortfall. A fledgling extrasolar planets program was shuttered in 2004, and a light-detection-and-ranging station that measures aerosols will be closed next year. Before he left in June, Killeen also dissolved the lab for societal impacts that he created. (Beyond a short interview conducted before he left NCAR, Killeen declined comment for this article, citing conflict-of-interest rules related to his new job as director of NSF's geosciences division, which funds NCAR.)

    The pruning has continued under Barron. This month, he ended a program run by a senior political scientist that conducted public policy research on the impacts of climate change on developing countries. The decision prompted an outcry from social scientists. But Barron says that he had no choice and that cutting it, along with the societal-impacts lab, will save NCAR $2 million annually. “We have not hurt CCSM nearly as much as other parts of NCAR,” says deputy lab director C. Larrabee “Larry” Winter.

    Getting the picture.

    An early version of NCAR's updated global climate model (lower right) does a better job of simulating actual ocean temperatures during an El Niño event (top) than an earlier model (lower left).


    The main impact of the budget squeeze on climate modeling has been on the workload of scientists. Pressured to coordinate an increasingly complex model with fewer colleagues, Rasch says that he and others couldn't explore “the ideas they found fascinating.” He says that's a big reason he left. James Hack decamped for the Department of Energy's Oak Ridge National Laboratory in Tennessee because, he says, “I had a better opportunity to build a program [there] than I did at NCAR.” Joining the recent exodus of atmospheric modelers is William Collins, who went to Lawrence Berkeley National Laboratory in California. “Each in their own way found that something else was better,” says Gilman. In addition, four scientists who worked on CCSM's land and chemistry components have left since last year and have not been replaced.

    Modeler David Bader of Lawrence Livermore National Laboratory in California says a few million dollars spent over the past 5 years on personnel could “have made a big difference” in attracting and retaining seasoned talent. Last year, NCAR brass moved roughly $1.5 million—culled from reshuffled NCAR funds and a small NSF boost—to support the research group that does climate modeling. But that just paid the salaries of software engineers moved into that group from within NCAR. The lab is trying to hire a senior and junior atmospheric modeler.

    Despite the negligible growth of NSF's contribution, NCAR has spent roughly $5 million since 2004 on equipment, including a balloon system, a mobile radar facility, and outfitting a high-altitude research jet that was completed in 2006. “It simply doesn't make sense to have a $100 million plane sitting in the hangar not doing science,” says Anthes, explaining why the money wasn't spent on science and modeling efforts.

    Down to the wire

    Not all the news at Breckenridge was bad. Scientists there applauded, for example, a much-improved simulation of the cyclical global climate phenomenon known as El Niño-Southern Oscillation (above). The previous version depicted an unrealistic 2-year El Niño cycle; the new version offers a more realistic cycle of 3 to 6 years. Along with promising new treatments of land, ice dynamics, and the North Atlantic, CCSM's overhauled atmosphere includes new physics to describe the way clouds move heat and shift winds. “It'll be a much better model,” Killeen says, thanks in part to the contributions from young researchers hired under his program.

    But many of the proposed additions to the model have yet to be fully tested, making scientists uneasy. One important improvement that's behind schedule, says modeler Richard Rood of the University of Michigan, Ann Arbor, a CCSM advisory board member, is a better depiction of the turbulent movement of air from the ground to an altitude of 1 km. “I would personally worry about the fact that they're still doing major tuning [on that],” he says. Another important change would add biogeochemistry, including the complex relationship between the carbon and nitrogen cycles. Attempts to simulate dynamic nutrient cycles can lead to big crashes during testing, such as when forests in the model die unexpectedly. Progress has been slow, says Scott Doney of Woods Hole Oceanographic Institution in Massachusetts, a member of the model's biogeochemistry team, and the recent departure of two NCAR experts—Natalie Mahowald and Peter Thornton—hasn't helped. “We're going to be lucky to get a stable climate biogeochemistry system,” he says.

    Knitting together the model's many promising additions by January poses an even greater challenge. Researchers would like to simulate local clouds better. But when scientists inserted new parameters to do that into the full working atmosphere, polar clouds blocked too much sunlight and created excessive sea ice. The team expects to work the physics into the full atmosphere, says Rasch, but it's unclear whether the feature will work when coupled to the ocean.

    The loss of seasoned modelers will be especially noticeable during the coming integration phase, says Rood. NCAR's young modelers are talented, he says, but lack valuable experience taming new parts of an unpredictable code. But NCAR atmospheric modeler Andrew Gettelman, who joined the CCSM team in 2003, says the departed veterans “are all there when we need them,” reachable by e-mail or phone. And the 38-year-old modeler says the fact that he wrote some of the overhauled code will help him make the pieces fit.

    Atmospheric scientists are cautiously optimistic that Barron, a paleoclimate modeler, recognizes the importance of CCSM. “There's no one else I would rather have at the helm,” says atmospheric scientist Marvin Geller of Stony Brook University in New York, citing Barron's leadership of several national climate-related panels. Barron says his recent moves show that he's serious about protecting “core” activities such as CCSM, noting that he used an early version of the model for his 1980 Ph.D. thesis at the University of Miami.

    Despite proposals by the White House and lawmakers to give NSF a double-digit increase in 2009, Anthes thinks that political gridlock could leave NCAR with another flat budget. So he demurs when asked if he sees light at the end of the tunnel. “I see hope,” he muses. “It may be moonlight.” For all his confidence about the next version of CCSM, Barron says continued flat budgets could devastate his lab's modeling efforts. “That's the real threat,” he says.


    Shielding a Buddhist Shrine From the Howling Desert Sands

    Richard Stone

    China is embarking on a major new effort to protect the Mogao Grottoes, a unique repository of murals and sculptures on the old Silk Road.


    DUNHUANG, CHINA—The medieval Chinese merchant prayed for good fortune before his men and their goods-laden camels and donkeys set off on a long westward journey. But the caravan soon ran into trouble: A camel tumbled off a cliff to its death, and thieves made off with reams of silk. These travails spring to life from a vivid mural, dated to the turn of the 7th century C.E., that's painted on the clay wall of Cave 420 of the Mogao Grottoes.

    One of Buddhism's most revered shrines, the 492 decorated caves that make up Mogao were hewn from a cliff between the 4th and 14th centuries C.E., 25 kilometers southeast of Dunhuang, an ancient Silk Road way station in western China. Most caves feature Buddha and disciples, pre-Buddhist deities such as the nine-headed god of Earth and the 13-headed god of heaven, and scenes from sacred texts and legends. The murals—and a trove of 40,000 documents discovered in one of the caves a century ago—also offer a singular glimpse into the Silk Road's ancient rhythms, from elaborate wedding festivals and treasure hunting to sporting events with archers on horseback. The “Library Cave” cache “transformed the study of the history of China and the Silk Road,” says Susan Whitfield, director of the International Dunhuang Project at the British Library in London. Highlights include the earliest known manuscript star map and the earliest dated printed document, a Buddhist sutra copied in May 868 C.E.

    In designating Mogao a World Heritage Site in 1987, UNESCO hailed it as a “unique artistic achievement” whose 45,000 square meters of murals include “many masterpieces of Chinese art.” “No other place compares to Mogao,” says Qu Jianjun, a geographer at the Cold and Arid Regions Environment and Engineering Research Institute (CAREERI) in Lanzhou.

    That's why China is launching a new effort to safeguard the caves. Earlier this year, its National Development and Reform Commission approved a $38 million project to protect Mogao's fragile artworks from three major threats: salt leaching from groundwater, exhalations and body heat from droves of tourists, and shifting sand dunes. Later this year, CAREERI plans to open an environmental research center in Dunhuang, dedicated in large part to Mogao and directed by Qu.

    Cave art.

    On one of the Silk Road routes (dotted lines) in China lie the Mogao caves and their images of Buddhist scriptures, such as this recently restored mural.


    As spectacular as the grottoes remain, damage over the centuries has diminished their grandeur. Some harm is human-wrought: After the shrines fell into disuse some 600 years ago, looters made off with scores of statues. (Some 2400 are left.) Further insults were inflicted early last century, when a retreating Russian White Army battalion holed up in the caves, blackening paintings with smoke from cooking fires and marring others with graffiti. Nature has taken a toll too. The five tiers of caves were carved along the west bank of the glacier-fed Daquan River. Spring meltwater surges occasionally flooded ground-level caves, destroying portions of murals, before engineers built embankments in the 1960s.

    Then there is the constant threat of sand. “Without proper efforts, the grottoes will be entirely buried by shifting sand,” says Qu. Dunhuang is an oasis on the edge of the Kumtag Desert, and searing winds whip down on Mogao from Mingsha Mountain, a 170-meter-high “megadune” of sand a few kilometers west that is creeping toward the grottoes. By the 1920s, sand had clogged the caves and they lay neglected for 40 years. “They were nearly lost,” says Qu. To prevent a repeat, in 1989 the Getty Conservation Institute (GCI) of the J. Paul Getty Trust in Los Angeles, working with Dunhuang Academy and CAREERI, erected a 4.7-kilometer nylon windbreak fence on the cliff edge above the caves. That has cut wind-driven sand by about 60%, says GCI principal project specialist Neville Agnew.

    To further blunt the advancing dunes, GCI and CAREERI teamed up in the mid-1990s to plant a vegetation barrier between Mogao and Mingsha. Qu and colleagues are now testing how gravel mulch—rocks of various sizes placed in grids on the sandy plateau above the caves—impedes sand flow.

    These and other measures have proven “highly effective,” says Agnew. There's less erosion of the cliff edge, he says, and less sand cascading over it and into the caves. GCI has since turned its attention to developing techniques to preserve Mogao's artwork and to site planning and management.

    Most caves, including those with some of the finest art, are off-limits. Under China's new initiative, the Institute of Computing Technology in Beijing is creating a virtual museum at Mogao's visitor center to allow tourists to experience closed caves. The initiative will also bolster efforts to mitigate sand damage and wind erosion. “The biggest challenge will be to stabilize Mingsha Mountain,” says Qu. The fate of one of the more vibrant vestiges of the old Silk Road may depend on it.


    Can High-Speed Tests Sort Out Which Nanomaterials Are Safe?

    Robert F. Service

    A flood of strange new substances based on ultrasmall particles is forcing researchers to reinvent toxicology.


    Bright invaders.

    Nanosized quantum dots pervade cells, tinting each structure a different color.


    If you've ever marveled at a painter mixing colors on a palette, you have a taste of what nanotechnologists feel when designing materials at the smallest scale. Nanoscientists today mix collections of different atoms to create a multitude of novel pint-sized particles with a vast array of new electronic, optical, catalytic, and chemical properties. Iron oxide particles, for example, are proving exceptional as contrast agents for medical imaging. Bits of titanium dioxide harvest sunlight in solar cells. And on and on. “These are materials with wonderful properties,” says Fanqing Chen, a biomedical scientist at Lawrence Berkeley National Laboratory in California. “Their contribution to science, technology, and the economy is going to be huge.”

    But the remarkable diversity of nanomaterials is also one of the field's biggest headaches. With potentially thousands of novel materials under investigation, health and safety regulators are left scratching their heads over which are safe and which are potentially toxic to humans or other species. That uncertainty threatens to undermine public confidence in nanotechnology and stymie the development of what is expected to be a major global economic engine. What's worse, traditional toxicology studies that test one material at a time appear woefully inadequate to the task of sorting the dangerous from the benign. “If we continue with the classical toxicological assessment [for nanomaterials], we will never catch up with this work,” says Andre Nel, a nanomaterials toxicologist at the University of California, Los Angeles.

    Benign or harmful?

    Rat lung cell attempts to ingest a carbon nanotube. Clumping nanotubes can stymie the animals' immune systems.


    In hopes of picking up the pace, Nel and a handful of other toxicologists around the world are creating batteries of high-speed assays for testing the toxicity of hundreds and even thousands of different nanomaterials at the same time. The assays involve exposing different types of cells to nanoparticles and monitoring the cells for changes, such as DNA damage, alterations in gene expression, cell death, and impacts on cell growth and proliferation. It's still too early to say whether these hopes will pan out. But at a minimum, nanotoxicologists hope high-throughput studies will help them prioritize nanomaterials that should undergo conventional toxicology tests.

    Clearly, there is cause for concern about the toxicity of some nanomaterials. Early studies with soccer-ball-shaped carbon fullerenes revealed, for example, that the particles can damage brain cells in fish. Conventional toxicology studies also raised yellow flags about the straw-shaped cousins of fullerenes called carbon nanotubes. When the lung tissue of rats was flushed with high levels of nanotubes, researchers found that the tubes clumped together, making it difficult for the animals' immune systems to clear them from their bodies. Still other studies have raised concerns about particles mostly considered harmless at larger sizes, such as titanium dioxide nanoparticles.

    Unfortunately, such studies are slow and expensive. A single in vivo animal study can cost $50,000. In addition, it's often difficult to extrapolate how a material tested in one tissue will affect others. It's even harder to extend that one study to determine how other nanomaterials will behave. The study that showed fullerenes can damage the brain cells of fish, for example, says little about exposure to particles of iron oxide, silicon dioxide, or carbon nanotubes.

    To confound matters further, particles can also come with different coatings that alter their properties, and they can even acquire new coatings upon entering the environment or the human body. Chen notes that some of his team's early studies reveal that the toxicity of nanoparticles can change even when they change size or shape but not composition. “They show different effects, even if they are the same material,” Chen says. “It looks like there is an additional dimension … when you are dealing with nanomaterials.”

    Faced with such complexity, researchers have begun looking to high-throughput testing techniques as a route to better, faster, and cheaper toxicology studies. Drug companies have long used similar methods to test compounds for curing and preventing disease. But because toxicologists must look for not just one positive outcome but potentially dozens of negative ones, they face a much tougher task: developing a battery of in vitro assays using a variety of cell types to assemble a fingerprint of ways a particular nanoparticle affects different tissues.

    One of the biggest such efforts to date is ToxCast, a 5-year initiative by researchers with the U.S. Environmental Protection Agency in Research Triangle Park, North Carolina. Launched in 2007 to speed up toxicity testing on chemicals in general, ToxCast is nearing the end of its pilot phase. Researchers are examining 320 different chemicals, such as pesticides, that all have previously undergone extensive conventional toxicological screens, says Keith Houck, an EPA toxicologist who is helping to develop the program. Houck says he and his colleagues are testing each chemical in a wide variety of cell-based assays, looking for 400 different “endpoints” that could signal danger to living organisms.

    “What we are trying to do is get a broad bioactivity profile for each compound and correlate those with toxicology studies we have,” Houck says. In the program's second phase, set to begin in 2009, Houck says ToxCast researchers will evaluate probably about another 1000 compounds, which will likely include several varieties of nanomaterials. Houck notes that the number of different starting nanomaterials, coupled with possible coatings and other molecules that could interact with them, makes it impossible to test every combination. “But we can use computational approaches to hopefully test the right ones and make some predictions about which materials are of most concern.” The compounds determined to be the riskiest will then be given top priority for conventional toxicity screening.
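
    The general shape of that analysis can be pictured as a chemicals-by-endpoints matrix that is then ranked to decide what gets conventional testing first. The Python sketch below is a minimal illustration of this idea under invented assumptions (random scores, an arbitrary activity threshold, a simple hit count); it is not EPA's actual ToxCast pipeline or scoring method.

```python
# Minimal sketch of a bioactivity profile: a matrix of chemicals x assay endpoints,
# ranked by how many endpoints show activity. Scores, threshold, and ranking rule
# are illustrative assumptions, not ToxCast's real data or methodology.
import numpy as np

rng = np.random.default_rng(0)
chemicals = [f"chem_{i:03d}" for i in range(320)]   # the pilot phase screened 320 chemicals
n_endpoints = 400                                   # ~400 endpoints examined per chemical

# Hypothetical normalized activity scores, one row per chemical
scores = rng.random((len(chemicals), n_endpoints))

ACTIVE_THRESHOLD = 0.95                             # assumed cutoff for calling an endpoint "active"
hits = (scores > ACTIVE_THRESHOLD).sum(axis=1)      # number of active endpoints per chemical

# Prioritize the chemicals with the broadest activity for conventional follow-up testing
priority = sorted(zip(chemicals, hits), key=lambda pair: pair[1], reverse=True)
for name, n_active in priority[:5]:
    print(f"{name}: active in {n_active} of {n_endpoints} endpoints")
```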

    Although the results from ToxCast won't be known for years, smaller scale efforts are already giving a sense of what the approach can offer. In a paper published in 2006 in Nano Letters, for example, Nel and his colleagues compared the effects of ambient ultrafine particles such as carbon black—produced in diesel exhaust—with the effects of manufactured nanoparticles such as chemically functionalized fullerenes and polystyrene. Using a series of assays that monitored levels of oxidative stress and cytotoxicity, they found clear differences in cellular responses to different particles. Functionalized polystyrene particles topped the list of materials causing the most oxidative damage to mouse macrophage cells.

    “In nature, there are only a dozen or so major pathways that lead to 90% of toxicological outcomes,” Nel says. So Nel and his colleagues are currently working to set up a broader set of screens to systematically monitor these pathways, conducting up to about 1000 cell-based assays a day. The screens, he says, will look for changes in addition to oxidative stress and cytotoxicity, such as DNA damage and changes in cell growth and proliferation. As ToxCast is doing, he then intends to gauge which nanomaterials are the likeliest to be harmful and put them first in line for more rigorous tests. “Instead of going through the haystack handful by handful, we've come up with a procedure to get rid of the hay,” Nel says.

    In demand.

    According to the Woodrow Wilson International Center for Scholars, the number of consumer products containing nanoparticles nearly doubled in 14 months.


    Chen has been developing a related approach. In another 2006 Nano Letters paper, he and colleagues looked at the genetic response of human skin fibroblasts when exposed to different nanoparticle-based imaging compounds. For their tests, Chen's team used gene chips that tracked how gene expression changed in about 22,000 genes in response to different types and doses of nanoparticles. Only about 50 of the genes showed significant changes. From the pattern, the Berkeley team concluded that—unlike several of the naked nanoparticles—particles coated with a polymer called polyethylene glycol induced minimal impact on exposed cells.
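
    The filtering step behind that result, winnowing roughly 22,000 measured genes down to the few dozen that respond, can be sketched as below. This is a toy illustration with simulated expression values and an assumed fold-change cutoff; the published analysis used real microarray data and statistical testing, which this sketch does not reproduce.

```python
# Toy sketch of differential-expression filtering: keep only genes whose expression
# shifts strongly between control and nanoparticle-exposed cells. All values and the
# cutoff are simulated assumptions, not the study's data or statistics.
import numpy as np

rng = np.random.default_rng(1)
n_genes = 22_000
log2_control = rng.normal(8.0, 1.0, n_genes)              # hypothetical baseline expression
log2_treated = log2_control + rng.normal(0.0, 0.2, n_genes)

# Spike in a small set of genuinely responsive genes for illustration
responders = rng.choice(n_genes, size=50, replace=False)
log2_treated[responders] += rng.choice([-2.0, 2.0], size=50)

FOLD_CUTOFF = 1.0                                          # assumed cutoff: |log2 fold change| >= 1 (2-fold)
log2_fc = log2_treated - log2_control
significant = np.flatnonzero(np.abs(log2_fc) >= FOLD_CUTOFF)
print(f"{significant.size} of {n_genes} genes pass the {2 ** FOLD_CUTOFF:.0f}-fold cutoff")
```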

    In a more recent study, Ralph Weissleder, a molecular imaging expert at Harvard Medical School in Boston, and colleagues have also turned to high-throughput analysis to sort out which nanoparticles have the best shot at succeeding as medicines. Nanoparticles are currently being developed as imaging agents, drug carriers, and, in some cases, therapeutic drugs themselves. A multitude of such compounds are under development, many of them coated with different small molecules to help target them to different tissues in the body, Weissleder says. But drugmakers want to avoid sinking too much time and effort into nanoparticles that have no shot at being approved for in vivo use because of their toxicity.

    So Weissleder and his colleagues set out to flag potential troublemakers. They evaluated 50 different nanomaterials at four concentrations on four cell types with four different assays. In the 27 May issue of the Proceedings of the National Academy of Sciences, they reported that they had found broad, consistent patterns of activity, although different cell types behaved very differently. “There is not a single test that will predict how nanomaterials will behave in vivo,” Weissleder says. But the batteries of cell assays can help researchers decide which ones are likely to be safest for human studies, he says.
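
    A screen of that shape, 50 materials at four concentrations across four cell types and four assays, is naturally organized as a four-dimensional response array and then summarized per material. The sketch below shows one such summary using simulated numbers; the labels, doses, and ranking rule are assumptions for illustration and do not reproduce the published analysis.

```python
# Sketch of organizing a nanomaterial screen as a 4-D array indexed by
# (material, concentration, cell type, assay), then summarizing per material.
# All labels, doses, and response values are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
materials      = [f"NP_{i:02d}" for i in range(50)]
concentrations = [0.01, 0.03, 0.1, 0.3]                  # assumed doses (arbitrary units)
cell_types     = ["endothelial", "hepatocyte", "macrophage", "epithelial"]
assays         = ["viability", "apoptosis", "membrane_integrity", "oxidative_stress"]

# Hypothetical normalized responses: 1.0 = untreated control, lower = stronger effect
responses = rng.normal(1.0, 0.15, (len(materials), len(concentrations),
                                   len(cell_types), len(assays)))

# Per-material summary: mean response at the highest dose, averaged over cells and assays
highest_dose_mean = responses[:, -1, :, :].mean(axis=(1, 2))
ranked = sorted(zip(materials, highest_dose_mean), key=lambda pair: pair[1])

print("Materials with the strongest average effect at the top dose:")
for name, score in ranked[:3]:
    print(f"  {name}: mean normalized response {score:.2f}")
```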

    Taken together, the early studies suggest that high-throughput assays could vastly speed up toxicological screening, Weissleder and others say. They can't do everything, Houck cautions: “In vitro assays won't ever replicate a whole body.” Still, if they can begin to sort out which types of nanoparticles pose the biggest risk, they could encourage targeted testing of those compounds and thereby help shore up faith in nanotechnology's future.
