News this Week

Science  08 Jan 2010:
Vol. 327, Issue 5962, pp. 130


  1. Public Health

    U.S. Panel Favors Wider Use Of Preventive Drug Treatment

    1. Jennifer Couzin-Frankel

    If regulators agree, the pool of people in the United States eligible for cholesterol-lowering drugs could soon expand dramatically to include as many as 6 million people whose cholesterol levels fall within a normal range. The potential boon to drugmakers and preventive health care came from an advisory committee to the U.S. Food and Drug Administration (FDA). The panel last month endorsed a wider use for Crestor, a statin manufactured by AstraZeneca. The decision, which could affect other statins too, raises tough questions: Who should get potentially risky medications to cut the chance of a deadly disease? And how many healthy people is it reasonable to treat to avoid one heart attack?

    Statins, which have been around since 1987, bring down cholesterol and lower the risk of heart attack and stroke. They are already taken by millions with no overt disease but with high LDL cholesterol. The definition of “high LDL” has been trending downward in recent years, however, and scientists are considering new groups for whom the benefits of statins might outweigh the risks.


    Heart attack is something statins can prevent.


    A clinical trial highlighted one of these groups in late 2008. Called JUPITER, it enrolled 17,800 middle-aged men and women whose LDL cholesterol levels were healthy by current standards but who had high blood levels of a marker for inflammation called C-reactive protein (CRP). Benefits appeared quickly in a very modest percentage of those treated: After less than 2 years on the drug, 142 people (1.6%) who got Crestor had had a cardiac event, such as a heart attack, stroke, or hospitalization for angina, compared with 251 (2.8%) who received a placebo (Science, 14 November 2008, p. 1039). Although the number directly helped was small, last month an FDA advisory committee voted 12 to 4 to offer Crestor to anyone who fits the JUPITER risk profile.

    Extrapolating JUPITER's results to a wider population could be tricky. Like other statins, Crestor can have side effects, including muscle weakness and liver toxicity. (It's also expensive at more than $3 a day.) And although JUPITER's results could prompt a wider use of all statins, no one knows whether the benefits conferred in the JUPITER trial would translate to other cholesterol drugs.

    In prevention circles, physicians often talk of the “number needed to treat” (NNT): how many people must receive a preventive therapy for one case to be avoided. The NNT in the JUPITER trial matches up favorably with NNTs in other cohorts treated to prevent cardiovascular disease. JUPITER's researchers, led by cardiologist Paul Ridker of Brigham and Women's Hospital in Boston, estimated that treating 29 people for 5 years would prevent one cardiovascular event. That's somewhat more impressive than the NNT for people with slightly elevated cholesterol who already take statins and, Ridker points out, for those who take blood pressure drugs to prevent heart problems. Furthermore, any NNT drops over time: As a cohort sticks with statins for many years, the number of heart attacks prevented climbs (as does the number of people coping with drug side effects).
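    The arithmetic behind NNT is simple: it is the reciprocal of the absolute risk reduction between the placebo and treated groups. A minimal sketch, using only the JUPITER event rates quoted above (the trial-period figure it yields is not the same as Ridker's 5-year extrapolation):

    ```python
    def nnt(control_rate, treated_rate):
        """Number needed to treat: reciprocal of the absolute risk reduction."""
        return 1 / (control_rate - treated_rate)

    # JUPITER event rates over roughly 2 years of follow-up:
    # placebo arm 2.8% (251 events), Crestor arm 1.6% (142 events)
    print(round(nnt(0.028, 0.016)))  # about 83 over the trial period
    ```

    Ridker's figure of 29 is lower because it extrapolates these event rates to 5 years of continuous treatment, which is also why, as noted above, any NNT falls the longer a cohort stays on the drug.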

    On the rise.

    18 million U.S. residents use statins, including Crestor.


    But attitudes are changing with the push for personalized medicine; many say that NNTs now deemed acceptable are too high, with too many people taking drugs who won't be helped by them directly. “We should be able to do much better” at predicting who's most likely to fall ill and treating that narrower cohort, says Eric Topol, a cardiologist and director of the Scripps Translational Science Institute in San Diego, California.

    CRP, for example, is a crude measure that's not tightly linked to inflammation of the arteries: It can also rise in someone with gingivitis or an inflamed joint. Cardiologists hope to find more refined markers. One possibility was reported on 24 December in a paper in The New England Journal of Medicine, in which researchers discussed two new genetic variants associated with the lipoprotein Lp(a), which appear to substantially raise the risk of coronary disease.

    Whether guided by CRP or something else, proposals for widespread statin use are intensely debated these days. Some see the benefits as undeniable. Preventive cardiologist Daniel Rader of the University of Pennsylvania points out that even average-risk individuals in middle age, looking ahead 30 years, have a relatively high chance of heart disease—as much as 30% or even higher. Focusing solely on NNT misses the point, he argues; it is “more of an economic issue” than a health consideration. Although “I don't think I'm quite ready to say that when you turn a certain age, start your statin,” says Rader, he did just that himself when he hit 50 last year. But given that statins have side effects, many cardiologists are leery of modifying guidelines without hard evidence.

    Another reason for caution is that some past efforts to expand preventive care have not stood the test of time. For example, says Steven Nissen, chair of cardiovascular medicine at the Cleveland Clinic in Ohio, “millions of Americans [were] told to take an aspirin a day” to prevent heart attacks. But several years ago, an FDA advisory panel on which Nissen sat “overwhelmingly” agreed that for those at low risk, the hazards of aspirin, such as bleeding, outweigh its benefits. “A therapy became established in very low-risk individuals when, on further reflection, there's evidence it's the wrong thing to do,” Nissen says. Some cardiologists don't want to chance a repeat with statins, and they stress that any extension of statin use should be considered only for people like those in JUPITER, with high CRP.

    If history is any guide, approving Crestor for a much wider audience could result in many takers. Statins are already enormously popular, and physicians working in prevention in other fields have been intrigued by the number of people who willingly take them for years. In breast and prostate cancer, by contrast, drugs exist that can cut 5-year risk of those cancers by as much as half, yet relatively few people opt for them. “We apply a different standard when it comes to cancer risk reduction” than when slashing cardiovascular risks, says Victor Vogel, national vice president for research at the American Cancer Society in Atlanta. In cancer, “there was a lot of criticism that drugs used for prevention have to be absolutely safe,” a standard Vogel considers unrealistic—and one that doesn't apply to statins, either.

    FDA hasn't made its final determination yet but usually follows its advisory committees' recommendations. It's expected to rule later this year.

  2. Japan

    2010 Science Budget Not Apocalyptic, as Feared

    1. Dennis Normile

    TOKYO—For weeks, Japan's scientific community agonized over spending cuts recommended by a government task force. Now researchers are breathing a sigh of relief: Although some projects will absorb big hits, the new administration's first budget, approved by the Cabinet at the end of December, calls for relatively minor changes in science priorities.

    Figures for total S&T expenditures won't be known until bureaucrats comb through individual ministry and agency budgets. But “the overall total has probably not decreased,” says Koichi Kitazawa, president of the Japan Science and Technology Agency, which administers government grants. The budget now goes to the legislature, which is expected to make few changes, and will take effect on 1 April.

    A handful of high-profile projects will suffer in 2010. Among the losers is the $1.3 billion Next-Generation Supercomputer project, slated for completion in 2012. The previous administration had earmarked $290 million for the effort; the new government will allot $230 million. “We had been hoping to complete the system early, but those plans have to be pushed back,” says Tadashi Watanabe, who heads the project for RIKEN, a network of national labs headquartered near Tokyo. The SPring-8 synchrotron near Kobe will have to get by on 2% less in fiscal 2010, or about $93 million. “It's not a big cut, but it will have some effect,” says Koki Sorimachi, head of planning for the RIKEN Harima Institute, which operates SPring-8.

    Meanwhile, the Japan Agency for Marine-Earth Science and Technology (JAMSTEC) is facing a 6% cut to its annual budget of roughly $400 million. Because of high fixed costs, “reductions will come out of pure research-related money,” says Asahiko Taira, an executive director of JAMSTEC. His organization aims to minimize the impact on the drill ship Chikyu, Japan's contribution to the International Ocean Drilling Program.

    Increased support for emerging fields should compensate for areas being cut, says Kitazawa. Gaining ground, for example, are green technology programs; money for initiatives under the education ministry will nearly triple this year to $107 million.

    Staying afloat.

    JAMSTEC hopes to keep the drill ship Chikyu operating despite a 6% budget cut.


    Funding for universities and academic research is essentially flat. The budget for grants-in-aid for scientific research, the bread-and-butter support for university researchers, will stay at $2.1 billion, matching a public plea last month by a group of university research officers. “More would have been better, but at least it's in line with our request,” says Masafumi Akahira, a vice president of the University of Tsukuba. But base funding for national universities is being squeezed 1% to $12.5 billion.

    At one point, scientists feared much worse. When the Democratic Party took power last August, it announced it would rewrite the rules for preparing budgets, starting with the fiscal 2010 budget proposed by the long-ruling Liberal Democratic Party just 2 days before its historic electoral defeat. In November, a task force set up to identify fat in the budget recommended freezing spending on the supercomputer, pending a review, and deep cuts to JAMSTEC and SPring-8 (Science, 20 November 2009, p. 1046). The ensuing storm of protests from scientists—including most of Japan's living Nobel laureates—got much of the funding restored.

    Managers are already worrying about next year. “We're really afraid this trend may continue. If the economy continues downward, we could face more budget cuts,” says Taira. In the past, bureaucrats worked out funding details for each program and institution and then totaled the numbers for each budget category. This time, the Cabinet set funding for broad categories. “If scientists can't explain their work to policymakers, they are going to see their budgets go down,” warns Kitazawa. With the new party bypassing the bureaucrats, Taira says scientists need to find new ways to influence policy. “Exactly how we're going to do that, we don't know yet,” he says.

  3. China

    After Long March, Scientists Create 'Chinese NIH'

    1. Li Jiao*

    BEIJING—Scientists here rang in the New Year with the debut of China's first biomedical research fund. Last week, the National Natural Science Foundation of China (NSFC) launched a medical department that expects to disburse about 1 billion renminbi ($150 million) in government grants in 2010.

    The department should be a shot in the arm for unraveling disease mechanisms, modernizing traditional Chinese medicine, and moving results from bench to bedside. “It will promote a speedy transition of basic research into clinical application,” says Pei Duanqing, director general of the Guangzhou Institute of Biomedicine and Health of the Chinese Academy of Sciences.

    For backers of basic biomedical research, the new department is a decisive victory in a decade-long ideological struggle. In 2001, when NSFC first declared its intention to create a medical department, “some people believed that there was no basic research in medical science,” says NSFC President Chen Yiyu. That unfavorable climate compelled many scientists to work abroad. In the early 1990s, says Ma Yue, a “poor atmosphere” and a shortage of grants made it “hard to do medical research.” Ma left for the United States in 1994 and returned here in 2006 to conduct stem cell research at the Institute of Biophysics of the Chinese Academy of Sciences.

    The prevailing winds shifted in 2008, when hematologist Chen Zhu was appointed health minister. He has campaigned vigorously for creation of an agency akin to the U.S. National Institutes of Health (NIH) (Science, 28 March 2008, p. 1748). Although Chen Zhu has not forsaken that goal, he threw his weight behind NSFC's effort. The health minister was “instrumental” in helping to get the medical department off the ground, says Chen Yiyu.

    Unlike NIH, NSFC's medical department will not have an intramural research program. Nevertheless, says Stephen Roper, a biophysicist at the University of Miami in Florida, “the target of NSFC and NIH is the same: apply basic research to solving ongoing human disease problems.”

    Chen Yiyu has tapped Wang Hong-Yang, an expert on hepatitis-induced liver cancer, as the medical department's first director. Wang, director of the International Cooperation Laboratory on Signal Transduction at the Second Military Medical University in Shanghai, will spend a third of her time here overseeing the new department. “My job is to clarify the research directions and make sure the best medical scientists get funded,” she says.

    That's music to the ears of Huang Liquan of the Monell Chemical Senses Center in Philadelphia, Pennsylvania. The medical department's initial budget “is an excellent start,” says Huang, who believes the new entity will usher in a much wider range of opportunities for cooperation between Chinese and U.S. scientists on basic biomedical research.

    • * Li Jiao is a writer in Beijing. With reporting by Richard Stone.

  4. Ecology

    Europe's Bats Resist Fungal Scourge of North America

    1. Erik Stokstad

    The same fungus that has devastated bat colonies in the northeastern United States has been identified for the first time in Europe—in a healthy bat. “The astonishing thing is that [the fungus] affects North American bats so devastatingly, but that European bats can get along with it,” says Christian Voigt, a bat physiologist at the Leibniz Institute for Zoo and Wildlife Research (IZW) in Berlin.

    White-nose syndrome was first identified in a cave in upstate New York in 2006. Since then, it has spread across nine states and caused unprecedented mortalities. Affected bats emerge from hibernation too frequently and lose body fat, and many starve to death. Last year, a group led by microbiologist David Blehert of the U.S. Geological Survey in Madison, Wisconsin, identified the fungus associated with the syndrome as Geomyces destructans, but many puzzles remain about the nature of the disease, such as whether the bats' immune systems are compromised (Science, 29 May 2009, p. 1134).


    This French bat was not killed by fungus on its nose (arrow).


    European researchers watched the U.S. outbreak with alarm. “I thought, ‘Oh my God, we've got a huge nightmare on our hands,’” recalls Kate Jones of the Zoological Society of London. So far, no mass casualties have been detected among Europe's species, but researchers did find anecdotal reports of bats with white fungus that no one had paid attention to previously.

    On 12 March, Sébastien Puechmaille of University College Dublin (UCD) spotted a mouse-eared bat (Myotis myotis) covered with fungus in a cave 130 kilometers northeast of Bordeaux, France. Microscopic examination of the spores and two molecular markers showed that it was G. destructans, the team reported online 29 December in Emerging Infectious Diseases. Another group, led by Gudrun Wibbelt of IZW, has also identified the fungus in bats from three other European countries, none reporting bat deaths. Their results have been submitted to the same journal.

    Now the challenge is to figure out why most European bats are not infected and why those that are remain healthy—and whether that knowledge can be used to help ailing bat populations in the United States. One scenario is that G. destructans has been present in Europe for a long time, and European bat species have evolved immunity, says Emma Teeling of UCD, the senior author of the December paper. Or perhaps the fungus evolved greater virulence after arriving in North America, a possibility that could be investigated with further sequencing.

    Whatever the explanation, the European reports are “great news,” says Alan Hicks, a mammal specialist with New York's Department of Environmental Conservation in Albany, who has charted the decline of the state's once-massive bat colonies. Eventually, an understanding of these differences could help lead to the development of a vaccine or treatments for endangered bats, Blehert says. Meanwhile, researchers are beginning once again to survey hibernating bats in the northeastern United States. Hicks says the signs so far are that deaths are continuing.

  5. India

    Fatal Fire and Tritium Poisoning Leave Nuclear Labs Searching for Answers

    1. Pallava Bagla

    THIRUVANANTHAPURAM, INDIA—A pair of mishaps has left India's nuclear establishment on edge. On 28 December, two biochemistry Ph.D. students burned to death in a mysterious fire in the country's main nuclear laboratory, the Bhabha Atomic Research Centre (BARC) in Mumbai. A month earlier, dozens of workers at a nuclear plant in Kaiga were exposed to tritium in an apparent case of attempted poisoning.

    The incidents—both unsolved as Science went to press—raise the specter of “terrorist elements checking the vulnerability of India's nuclear establishment before a bigger and more deadly attack is mounted,” argues former BARC director A. N. Prasad. Others discount that possibility. “I don't believe [that's the case],” materials scientist Srikumar Banerjee, chair of India's Atomic Energy Commission (AEC), told Science on the sidelines of the India Science Congress here this week. “These are two isolated events.”

    Seeking clues.

    Last month's fire at the Bhabha Atomic Research Centre in Mumbai has put AEC “on a war footing,” says commission chair Srikumar Banerjee.


    Even before the twin incidents, the Department of Atomic Energy (DAE) was on heightened alert. After U.S. terror suspect David C. Headley was arrested in Chicago last October in possession of photos of BARC, DAE ordered a security audit for all nuclear facilities. Now the department has ordered an additional safety review. “We have been doing this on a war footing,” says Banerjee.

    The fatal fire last week was the worst accident in BARC's 55-year history. The facility is home to India's nuclear weapons program—but the fire was “in no way related to the strategic program,” Banerjee says. A DAE spokesperson adds that “no reactor, radioactivity, or radiation was involved in the accident.” However, says Prithviraj Chavan, India's science minister and an AEC member, “we have not yet been able to exactly pinpoint what … caused the fire.”

    The fire apparently was set off by a midday explosion that shook the third floor of BARC's Radiation and Photochemistry Division. Firefighters quickly doused the flames. From an analytical chemistry lab—the only area that suffered damage—they recovered two bodies charred beyond recognition. The victims were later identified as Umang Singh, 25, and Partha Pratim Bag, 24.

    The fire “is baffling,” says Tulsi Mukherjee, director of BARC's chemistry group. The chemistry lab where the fire broke out was “not functional,” as it was being refurbished and had been painted a few days earlier. “There was just not enough incendiary material in the lab to have caused this devastating fire,” Mukherjee says. The lab, he says, housed a spectrophotometer that was turned off, two computers, a nitrogen cylinder—intact after the accident—a laminar flow hood, and small quantities of solvents. “No one heard any screams or shouts for help,” Mukherjee says. Singh and Bag were preparing to study the possible use of herbal extracts for radiation protection, he says, and usually worked with “harmless chemicals.”

    A more bizarre incident occurred on 24 November at the 220-MW pressurized heavy water reactor in Kaiga, some 700 km south of Mumbai. That day, as many as 92 workers drank from a water cooler tainted with tritium-laced heavy water. “It was perhaps the work of some disgruntled employee” who spiked the water cooler, says Chavan. “The area was a security area; no question of anybody from outside coming in.” No workers were harmed by the slight exposures to radioactivity, and all have since returned to work. No arrest has been made so far.

    Like Banerjee, Chavan dismisses a terrorist threat but sees an urgent need for stricter measures at labs across India. For starters, Prasad suggests that authorities conduct more rigorous background checks on personnel.


  6. ScienceNOW

    From Science's Online Daily News Site

    Five New Exoplanets Discovered

    Those hoping that the opening plenary talk of the American Astronomical Society meeting would deliver a stunning revelation probably came away disappointed. NASA's Kepler mission has added five new planets to a growing roster of more than 400 beyond our solar system—and none of the newbies is remotely hospitable to life. But there's still plenty to chew on. One of the planets, for example, is as light as Styrofoam—and that has astronomers scratching their heads.

    Mosquitoes: Love at First Buzz

    How do you mate with the right person if everyone looks exactly the same? That's a problem that faces the Anopheles gambiae complex of mosquitoes, a group that comprises six identical-looking species. The solution, according to a new study, is to find a partner who can sing in perfect harmony with you.

    Read the full postings, comments, and more online.

  7. ScienceInsider

    From the Science Policy Blog

    Planes, Boats, and Greenhouse Gas A new report lays out the challenges of reducing greenhouse gas emissions from the airplane and boat transportation sectors. Those sectors account for roughly 3% of global greenhouse gas emissions, but their contribution could grow 10-fold by 2050. Recent moves by the airline industry to study biofuels and even hydrogen-powered airplanes might mitigate such carbon pollution, however.

    Deadline Looms for Earth-Sensing Satellites Congress is demanding that the three agencies that run the National Polar-orbiting Operational Environmental Satellite System provide plans to overhaul the management structure for the troubled system. As of press time, NASA, NOAA, and the Pentagon had yet to file a report lawmakers wanted by 4 January on the costs and management options for the $15 billion system.

    For the full postings and more, visit ScienceInsider online.

  8. Invasion Biology

    From Medfly to Moth: Raising a Buzz of Dissent

    1. Ingfei Chen*

    Twenty years after California's Medfly wars, a vocal critic of government eradication policies is back in a battle over a new invader: the light brown apple moth from Australia.

    Indiscriminate eater.

    The light brown apple moth, an invader in California, feeds on more than 2000 plant species.


    James Carey is at it again. In the early 1990s, as a scientific adviser in California's unpopular pesticide-spraying war against the Mediterranean fruit fly, the entomologist vocally charged that the state's program was fundamentally flawed. Bucking conventional wisdom, Carey claimed that the Medfly was already established, defying the eradication attempts.

    Carey, a professor at the University of California (UC), Davis, then largely vanished from the invasion-biology research scene, gaining prominence instead as an innovative biodemographer who has elucidated universal principles of aging by tracking mortality and reproduction rates in huge populations of insects. He currently directs a $3.4 million federally funded program to investigate the ecology, evolution, and mechanisms of life span and aging.

    Then in February 2007, a voracious new invasive pest—the light brown apple moth from Australia, dubbed LBAM—was identified in Berkeley. The insect's larvae feed on more than 2000 plant species, from apples, grapes, and berries to cypress trees. The California Department of Food and Agriculture (CDFA) kicked into crisis mode to get rid of the moth in northern California, launching a program of aerially spraying a pheromone to disrupt the insect's mating. But those efforts prompted a red-hot public ruckus, forcing the state to shift to a plan to release zillions of sterile moths to achieve the same ends. And once again, Carey has surfaced as a relentless voice of dissent.

    His core argument is essentially the same. Contrary to the agriculture agencies' view that the moth is a new and vanquishable arrival, he thinks it was established long ago and is too widespread to wipe out. The idea of a long-standing invasion can't be discounted, yet it is hard to prove or disprove. But it is Carey's take-no-prisoners style, as much as his bold scientific interpretations, that has riled agriculture officials from Sacramento to Washington, D.C.

    Carey calls the moth-eradication program “a travesty” driven by politics where instead rigorous science should be brought to bear. He says that as a scientist at a public university, he feels a responsibility to render his candid expertise, especially when other entomologists may be reluctant to criticize the agriculture agencies that provide research funding. “I'm not an environmentalist per se, but it just looked like something that was completely wrongheaded,” he says of the LBAM plan. Evidence doesn't support that eradication is feasible—or even needed, he says, because it's unclear that the moth is indeed a major crop pest.

    Carey's admirers say his contrarian views have a place and that he has raised important issues in the invasive-species debates. Others say his insistent criticism has helped derail the state's efforts to deal with the apple moth, resulting in more harm than good. Officials at CDFA and the U.S. Department of Agriculture (USDA) declined to answer any questions from Science about Carey or his scientific critiques.

    Taking a stand in the Medfly wars

    When an exotic pest first pops up on the radar, the great worry is that it may wreak havoc as it outcompetes or devours an area's native species. Agricultural agencies, as well as some ecologists, traditionally have regarded the discovery of a foreign species as a new infiltration. But with the Medfly and LBAM, some invasion biologists like Carey have viewed the new detection as sightings of a population that actually came and settled in earlier.

    Carey's radical take, however, is that this invasion process may unfold not over months or years but over decades to a century, like the slow, long-latency growth of a stealthy cancer: most of the doublings of a new exotic pest population remain invisible, at still-tiny numbers that elude trapping surveys. By the time the established population is diagnosed, it's often metastatic and can at best be controlled, not cured.

    Carey came to this model 2 decades ago, after plotting by pencil the locations of every Medfly the state had ever caught on a series of maps—949 flies in 106 cities, with some detections popping up in the same exact neighborhoods, years apart, in a peculiar pattern. It was 1990; near-yearly infestations of the devastating crop pest were plaguing the Los Angeles basin despite aerial malathion sprayings. Carey, who'd been conducting demography studies of the fly, was on CDFA's Medfly science advisory panel.

    CDFA and USDA officials explained the recurrences as new introductions of flies hitchhiking on fruit brought in from abroad. But in testimony before the California Legislature, Carey presented his dissenting view that the state's eradication efforts were failing to fully eliminate a long-resident Medfly population; a rethinking of how to fight the insect was needed. He published his hypothesis in Science in 1991.

    Carey's role in the Medfly debate “was akin to pointing out that the emperor has no clothes,” says Daniel Simberloff, an invasion biologist at the University of Tennessee, Knoxville, laughing. Simberloff's perspective is that Carey “was probably right, in general.” But the agencies rejected Carey's theory, which carried enormous economic repercussions: If the state declared it couldn't eradicate the Medfly, California's multibillion-dollar-a-year farming industry would be embargoed from shipping produce to other countries or would be required to implement costly control measures.

    The agencies also noted that trapping arrays had failed to spot any Medflies after each eradication. Carey insisted that a low-level population was lurking below the radar, but “you can't prove a negative,” says retired USDA entomologist Derrell Chambers, then a colleague on the advisory panel.

    Which theory was right? In 2001 and 2002, genetic analyses of Medflies captured in the '90s found evidence for several separate introductions and for the existence of populations that persisted from one year to the next. “Both things happened, and we need to accept that and learn from it,” says David Haymer, a University of Hawaii, Manoa, geneticist and co-author of one of the studies. Yet, consensus remains elusive on whether those persisting flies represented small, incipient populations or an established one. Carey is sticking to his guns. And CDFA press releases still claim that the state has successfully eradicated every Medfly introduction since 1975.

    Into the moth maelstrom

    Pesky dissenter.

    Jim Carey, here with his old Medfly plotting maps, wants more data and less politics in California's invasive-pest policymaking.


    For those who followed the Medfly wars, the LBAM debate seems like déjà vu. Upon learning that the Davis biodemographer was involved, “I thought, Oh, my God, there's Carey again,” recalls Simberloff, who last summer sat on a National Research Council (NRC) panel that reviewed certain aspects of the controversy at USDA's request (see sidebar, p. 136).

    In late 2007, Carey received an e-mail from a citizens group that was filing a lawsuit to stop the state from spraying pheromone over Monterey and Santa Cruz. Would Carey weigh in on the matter? Given that soon after the moth's discovery, infestations were spotted in nine counties—an area of more than 20,000 square kilometers—Carey believed the invasion was old and too far gone. “There was absolutely no way in the world that they were going to have any chance to eradicate this thing,” he recalls; pheromone spraying and other tools were too weak to do the job.


    He submitted an affidavit to that effect—and thereby leaped into the biggest, most bitterly divisive battle over an invasive species in the Golden State in 2 decades. Any way one slices it, the $89.5 million moth-eradication plan has been a public relations disaster. Besieged by lawsuits and a fierce backlash from a public fearful of anything resembling a pesticide—as well as a mounting debate over just how dangerous a pest the moth really is (see box, right)—CDFA tabled the aerial pheromone treatments after several sprayings. CDFA and USDA believe LBAM arrived recently and was shuffled around via the nursery-plant trade, because prior to 2007, the state's network of moth-luring traps failed to pick up the insect.

    But LBAM is tricky to distinguish from many other nondescript little brown moths, and Carey has said all along that the trapping network was inadequate for detecting this pest's presence. (He also disputes the conventional wisdom that the increased movement of invasive species via trade or human traffic can explain the sudden, widespread appearance of a pest like the moth. If so, the insect should be cropping up in Arizona or other states too, he says.) The NRC panel independently reached the same conclusion after reviewing trapping protocols and data from the state and other sources.

    Carey goes so far as to claim that the moth has been in California for 30 to 50 years. “I'm not kidding,” he says—the invasion process is chronic, insidious, and long undetectable. His estimate is roughly extrapolated from the case of the exotic gypsy moth, which took 4 decades to spread across 25,000 square kilometers in the Northeast.

    USDA and CDFA have noted that there are no hard data to support Carey's calculation. There is no way to verify or disprove the estimate, says Ring Cardé, an insect pheromone researcher at UC Riverside and an adviser on CDFA's technical working group on the moth.

    But regardless of whether the invasion started 3 or 50 years ago, others concur with Carey's central point: The insect is too far-flung to eliminate. NRC panel member Simberloff raises his eyebrows at Carey's half-century guesstimate but nonetheless thinks that the moth's sprawling distribution and numbers suggest “it's been here awhile” and was beyond hope of eradication in 2007.

    Despite such criticisms, CDFA and USDA are moving forward on the expensive sterile-insect technique (SIT). USDA has been breeding and sterilizing moths for release as a contraceptive to the insect's reproduction; field testing in Napa and Sonoma vineyards began last November and will continue through late 2011. Carey says the effort, like the pheromone spraying, is “throwing money down a rat hole”; he thinks the state should instead shift to areawide pest management of the moth.

    It will take a couple of years to develop a robust SIT program. Even CDFA science adviser Cardé admits it is uncertain whether, by then, the tool could push back the moth population's boundaries, although it could still help manage low-density infestations or eradicate small outlying pockets. Cardé and the technical working group maintain that had the state been able to carry out the pheromone-spraying program to reduce the moth's population prior to SIT, there was a chance of eliminating it. But now, if the invasion keeps spreading, eradication becomes more and more of a long shot, Cardé says.

    Cardé concedes that Carey's questions about the age of the invasion and whether it can be eradicated are reasonable. But questions that lack definitive answers can sometimes be used by advocates to “completely scotch” agency programs from moving forward—as has happened with LBAM, he says.

    Furthermore, others point out that the agencies typically don't have the luxury of an academic debate when they have to make quick political and financial decisions with billions of dollars in trade at stake. (Even without major moth-inflicted crop damage, the potential for huge trade losses is, again, driving the state's eradication push.)

    Agency officials are now concerned that the moth case may hurt their ability to eradicate future exotic pest invasions that may be far more dangerous to agriculture, Cardé says; opponents may try to employ the same counter-arguments. “It's very worrisome,” he says.

    Carey bristles at Cardé's criticism. Scientists have a responsibility to push back and “not to simply go along, get along” with a flawed eradication plan that will waste tens to hundreds of millions of dollars, he says. And in no case has a pest like the apple moth—so widespread and with so many plant hosts—ever been eradicated, he says.

    Pilot project.

    USDA light brown apple moth program coordinator Gregory Simmons releases thousands of sterile moths into a Napa vineyard last November.


    Others applaud Carey's efforts to inject more science into agency decisions. “His actions represent science at its best as a pursuit of understanding reality, not just fitting a preconceived agenda,” writes Hawaii's Haymer in an e-mail.

    Rethinking the future

    To Carey, the apple moth episode epitomizes a system badly broken. “This pest-invasion paradigm has got to be revisited in a big way,” he says. With new exotic species flooding the state and public opposition to any chemical spraying, eradication isn't always technically or politically possible. The relatively new concept of areawide control, implemented through “pest-free” zones of trade, should be tried instead, he says.

    At UC Davis, the chair of Carey's entomology department, Michael Parrella, is organizing a conference for this spring to reexamine the invasive species–policy paradigm from top to bottom. The goal is an open dialogue with major stakeholders, including USDA and CDFA administrators. Will they come? Parrella acknowledges that Carey's forceful style can be a deterrent. Says Parrella: “It would be nice to think we could sit down and discuss things. It's not us versus them.”

    * Ingfei Chen is a writer in Santa Cruz, California.

  9. Invasion Biology

    Gaps in Moth Logic

    1. Ingfei Chen

    Nearly 3 years after the Australian light brown apple moth was first identified in Berkeley, California, major crop damage has yet to materialize. Some entomologists believe that the state's farmers can control the exotic invader without having to eradicate it.

    Not so bad after all?

    Scientists are debating whether the light brown apple moth truly poses a major threat to California crops.


    When the Australian light brown apple moth was identified in Berkeley in 2007—its first sighting in North America—the California Department of Food and Agriculture (CDFA) and U.S. Department of Agriculture (USDA) rushed into action to eradicate it. They saw a grave emergency for California agriculture, and justifiably so, given the insect's reputation as a voracious, indiscriminate eater with a liking for many of the state's most valuable crop species. USDA, which had long ranked the moth a high-risk pest that required quarantines to prevent its spread, warned that it could be one of the most destructive invaders ever.

    But nearly 3 years later, the insect hasn't lived up to that reputation. More than 257,907 moths have been trapped in 18 counties from Napa to Los Angeles, but they haven't yet made substantial inroads into the prime agricultural Central Valley. Major crop damage hasn't materialized.

    Entomologists Frank Zalom and James Carey of the University of California, Davis, along with former UC Santa Cruz arboretum director Daniel Harder, believe that the state's farmers can learn to live with the exotic invader—perhaps with the same control tools they already use to manage other moths in the same leafroller family. Along those lines, two groups of citizens, including Harder, have petitioned USDA's Animal and Plant Health Inspection Service (APHIS) to reclassify the insect as a minor pest so that costly quarantine measures can be lifted; forget eradication, they say. However, USDA and CDFA maintain that without an eradication plan, or if the pest's status were downgraded, trading partners (and other states) would likely permanently ban or slap restrictions on California produce. (Canada and Mexico have already enacted some restrictions.) Whether USDA can negotiate around such barriers is an issue of debate.

    APHIS drafted a denial to the petitioners' request but also asked the National Academy of Sciences to evaluate the response. In a report last August, a committee of the academy's National Research Council determined that APHIS has the regulatory authority to continue classifying the moth as high-risk. But the panel found that the agency's rationale for that rating—which, others note, is the basis for justifying the state's moth-eradication program—wasn't grounded in sound, rigorous science.

    Specifically, USDA's model predicting a dramatic spread of the insect throughout the southern United States relied upon “questionable” assumptions, says May Berenbaum, a University of Illinois, Urbana-Champaign, entomologist who chaired the review. That geographic projection was then plugged into assessments of potential national economic damages that used “inconsistent and sometimes incomprehensible analytic techniques,” the reviewers wrote. In an extreme scenario, the USDA analysis estimated $9 billion in yearly losses from global trade restrictions.

    Berenbaum says USDA was “between a rock and a hard place” in navigating trade laws and making rapid decisions when little is known about how the moth will behave here. It is to APHIS's credit, she adds, that it asked for the academy's feedback and made some revisions in its response, which has not yet been finalized. An APHIS spokesperson wrote in an e-mail that the agency continues to seek the best predictive models.

  10. Research Institutions

    Plan to Merge Texas Schools Runs Into Faculty Opposition

    1. Jocelyn Kaiser

    Baylor College of Medicine, seeking financial security, is considering joining with nearby Rice University, but Rice faculty members have challenged the plan.

    On the congenial campus of Rice University in Houston, Texas, faculty members in departments such as bioengineering and physics have lately found themselves facing off in a bitter public dispute. The issue: whether Rice, a small research university with deep pockets, should merge with nearby Baylor College of Medicine (BCM), which needs money. Proponents say it is a natural to add a top medical school to Rice's strengths in physical and life sciences. But critics at Rice say that pairing up with BCM is not worth the financial risks.

    BCM, the only private medical school in the U.S. Southwest, is part of a huge complex called the Texas Medical Center that includes a dozen or so hospitals and the M. D. Anderson Cancer Center. Across the street from the complex is Rice, which has about 3300 undergraduates and 2300 graduate students. The two schools already have joint research and education programs, and many Rice undergrads go on to medical school at BCM.

    Officials floated the merger idea 15 months ago as Baylor was seeking to overcome a financial crisis that began in 2004 when it split from the Methodist Hospital System, a key source of clinical income. (The college continues to have teaching partnerships with other hospitals.) Baylor started to build its own hospital but froze construction last March after the project went over budget. Rice administrators argue that Baylor would gain a measure of financial stability by joining the university. The joint operation, they say, would be highly competitive for federal research funding, especially as biology and the physical sciences increasingly overlap. “To us, it is very, very compelling,” says Rice Provost Eugene Levy. Rice officials note that acquiring BCM's more than $210 million in National Institutes of Health support would elevate Rice from 130th to 23rd in the country in federal research funding.

    Marriage proposal.

    Rice (foreground) has offered to tie the knot with nearby Baylor College of Medicine.


    Last summer, a joint Rice-BCM committee came up with a list of “research synergies” that could be pursued. They included creating a neuroscience major and a personalized medicine initiative involving Baylor's NIH-funded human genome–sequencing center. “From our side of the road, it offers great opportunities for collaboration on what is really the future of biomedical research,” says committee member and BCM neuroscientist Michael Friedlander, speaking for himself. (Baylor is granting no official interviews.)

    A Rice-only faculty committee, however, was more cautious about the possible merger, noting that it will require “a substantial one-time investment” from Rice. This panel said the merger should take place only if certain conditions are met, including putting Baylor on a “credible path” to eliminating its deficit, partnering with general-care hospitals, and raising $250 million for Rice programs.

    In November, a public debate on the merger erupted in a series of op-eds and letters to the editor from Rice faculty, some pro, others against. “The costs have been underestimated, and the academic benefits have been hyped,” says Moshe Y. Vardi, a Rice computer science professor who has been a vocal opponent. Others worry that the Rice administration's attention will shift away from physical sciences and humanities. “We could become a very unbalanced institution,” says chemical and biomolecular engineering professor Matteo Pasquali.

    In early December, a group of Rice faculty met and voted 61–59 against a resolution opposing the merger—defeating the faculty opponents only because four administrators exercised their right to vote. A recent survey of Rice's roughly 700 faculty found a similar split: 50% oppose the merger, 39% support it, and 11% are undecided, according to a Rice faculty member with a copy of the results.

    Levy responds that faculty opinion is actually “more nuanced.” He says that if financial concerns can be resolved, there is “substantial support.” Indeed, 53% of faculty would support the merger and 39% oppose it if specific conditions listed by the Rice administration are satisfied, the survey reportedly says.

    Observers expect the Rice board to make a decision before a memorandum of understanding between the two schools expires on 31 January. Vardi, who has pored over an audit of Baylor's books, points out that the document states that without a merger agreement, creditors will soon require the medical school to hire a manager to oversee a cost-cutting reorganization. Baylor faculty seem confident that the college will persevere and build stronger hospital partnerships even without a merger. “It's not a do-or-die thing,” says Friedlander.

  11. Materials Science

    Next Wave of Metamaterials Hopes to Fuel the Revolution

    1. Robert F. Service

    Designing invisibility cloaks may be fun. But for more practical applications of metamaterials, scientists need to find ways to have less of the light absorbed.

    Going vertical.

    Carving intricate structures in polymers (far left) is the first step in creating metamaterials with metallic features (left) that can manipulate light.


    In 2001, researchers in the United States and the United Kingdom pulled off a trick with light that few thought possible: They bent it backward. To be more precise, they altered the refraction pattern by shining microwaves at material made from a circuit board topped with an array of rings and wires. The feat forced physicists everywhere to rethink what they thought they knew about manipulating light.

    The concoction was the first to demonstrate this odd light-bending trick. And its negative index of refraction made it one of the original “metamaterials.” Metamaterials are composites engineered to manipulate electromagnetic waves in new ways. Researchers have designed everything from invisibility cloaks and lenses that focus light to a point smaller than the diffraction limit—the tightest focus possible by conventional optics—to materials that mimic the light-trapping ability of black holes. “There have been tremendous developments in metamaterials over the last 10 years,” says Xiang Zhang, a physicist and metamaterials expert at the University of California (UC), Berkeley.

    But the field faces sizable challenges as well. Despite researchers' successes in manipulating electromagnetic waves, real-world uses have been hampered by the fact that most metamaterials absorb light strongly and can be fabricated only as ultrathin layers. “Conceptually, there has already been a revolution,” says Martin Wegener, a physicist at the University of Karlsruhe in Germany. “Whether it will lead to revolutionary products remains to be seen.” The challenge, Wegener and others say, is to come up with better ways to create complex, three-dimensional patterns in thick materials and limit their tendency to absorb light.

    Active materials

    Centuries ago, glassmakers realized that they could focus light by precisely cutting, grinding, and polishing glass. That discovery led to everything from eyeglasses to telescopes. More recently, fiber optics has made possible modern communications. All of these devices manipulate light based on the chemical composition of the matter through which light travels.

    Conventional optical materials have a positive index of refraction, the ratio of the speed of light in a vacuum to its speed in the material. The index change between air and water, for example, is what makes a straw submerged in a glass appear to bend.

    In the 1960s, Russian theoretical physicist Victor Veselago realized that if materials could be properly engineered, their index of refraction could be negative. If water's refractive index were negative, for example, a straw entering it would appear not just to bend but actually to stick out of the water's surface. Veselago's work implied that flat metamaterials could act like lenses and produce other counterintuitive phenomena, such as a reverse Doppler effect and negative refraction.
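    Veselago's insight can be stated with the standard textbook form of Snell's law (a general optics relation, not a formula from this article):

```latex
% Snell's law at the interface between media 1 and 2:
\[
  n_1 \sin\theta_1 = n_2 \sin\theta_2,
  \qquad
  \theta_2 = \arcsin\!\left(\frac{n_1}{n_2}\,\sin\theta_1\right)
\]
% If n_2 > 0, the refracted ray crosses to the opposite side of the
% surface normal, as in ordinary glass or water. If n_2 < 0, the sign
% of sin(theta_2) flips and the ray emerges on the SAME side of the
% normal as the incident ray -- the "backward" bending demonstrated
% with microwaves in 2001.
```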

    It took more than 30 years, but in 2001, researchers led by John Pendry, a theoretical physicist at Imperial College London, and David Smith, now at Duke University in Durham, North Carolina, made just such a material. They began with an assembly of metal wires and rings, the latter having had a thin slice removed (Science, 6 April 2001, p. 77). When Pendry and his colleagues then shined microwaves on their metamaterial, the microwaves excited electrons in the metal rings, causing them to slosh back and forth. That sloshing produced a resonant magnetic field that affected the propagation of subsequent microwaves, producing a negative index of refraction. It also spurred a tide of related innovations.

    Since then, physicists and materials scientists have had little trouble designing metamaterials that work with radio waves, microwaves, and terahertz waves, all forms of electromagnetic radiation with long wavelengths. That's because the individual components of a metamaterial must be smaller than the wavelength of the light they are trying to manipulate. For microwaves, for instance, that means features in the centimeter range.
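    The scaling rule above follows from wavelength = speed of light / frequency. A quick back-of-the-envelope check (ordinary arithmetic, not a calculation from the article) shows why long-wavelength metamaterials are the easy ones to fabricate:

```python
# Free-space wavelength for a few bands, from lambda = c / f.
C = 3.0e8  # speed of light, m/s

def wavelength_m(freq_hz):
    """Free-space wavelength in meters for a given frequency in hertz."""
    return C / freq_hz

# 10 GHz microwave -> 3 cm: features can be machined at centimeter scale.
print(wavelength_m(10e9))   # 0.03
# 1 THz -> 0.3 mm: still reachable with standard patterning.
print(wavelength_m(1e12))   # 0.0003
# Green light, ~600 THz -> 500 nm: features must shrink to tens of nanometers.
print(wavelength_m(6e14))   # 5e-07
```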

    But to manipulate shorter wavelength light, such as infrared or visible rays, researchers must design metamaterials with features on the micrometer or nanometer scale. That's certainly possible with conventional microchip patterning techniques. But in many cases the patterns that metamaterials makers are trying to create are more complex than microchip patterning can handle.

    Absorption is another big problem. Metamaterials work because incoming light triggers electrons to flow in ways that create a standing magnetic field that affects the propagation of the light waves that follow. Researchers typically use metals, which are good electrical conductors, to make the portions of their metamaterials that carry electron current. But metals are also strong absorbers of visible and infrared light.

    Researchers have partly sidestepped the problem for metamaterials that work in the visible and infrared ranges by using just single ultrathin layers of metals. “The losses are so high right now, so we can only use thin films for metamaterials applications,” says Costas Soukoulis, a condensed-matter physicist at Iowa State University in Ames. But the light must propagate through thicker samples for scientists to be able to observe many unique properties of metamaterials, such as a negative refractive index.

    Better in bulk.

    These 15-layer “fishnet” structures produce a negative refractive index—but only when the light hits them from above.


    Going 3D

    Metamaterials designers have tried a number of approaches to counteract the losses from absorption. Foremost among them are novel, complex 3D metamaterial designs that manipulate the spacing of features in the material. If the geometry can be tailored precisely enough, it can prevent light of certain frequencies from being absorbed. Reporting in the 18 September 2009 issue of Science (p. 1513), for example, Wegener and his colleagues used a method called direct laser writing to cut into a polymer slab an array of intricate helices that turned either clockwise or counterclockwise. They then filled those helical vacancies with gold and removed the polymer to create a 3D array of gold helices.

    When they shined infrared light along the long axis of the helices, the arrays acted like a filter for polarized light, allowing light with certain polarizations to go through while blocking others. There are already polarization filters that work with conventional optics. But the metamaterial version is able to work over a much broader range of frequencies.

    Redesigned rings.

    Sandia researchers are patterning gold split rings on curved polymer surfaces to manipulate incoming infrared light.


    Groups are making progress in designing bulk metamaterials that work with visible light as well. In the 18 September 2008 issue of Nature, for example, Zhang and his UC Berkeley colleagues described a bulk-fishnet structure that had a negative refractive index for near-infrared light. They made the material by stacking alternating layers of silver and magnesium fluoride and used a focused ion beam to cut a fishnet pattern of holes into the stack, leaving behind structures that control the movement of electrons.

    The pairs of conducting and nonconducting layers form a circuit. And the series of circuits created by stacking the layers generate a resonant magnetic field that causes incoming infrared light to have a negative index of refraction. In the 15 August 2008 issue of Science (p. 930), Zhang's group reported achieving a related effect with red light by using an arrangement of vertically aligned silver nanowires grown inside a sheet of porous aluminum oxide.

    But fishnets, helical structures, and most other metamaterials typically work only when the light hitting them comes from a particular direction, or within a narrow range. At a recent meeting,* Michael Sinclair reported that he and his colleagues at Sandia National Laboratories in Albuquerque, New Mexico, are developing a technique to produce metamaterials that work with infrared light coming from virtually any direction. Instead of depositing magnetic features on a flat surface, the researchers evaporate gold through thin slits in a membrane onto curved polymer surfaces. In theory, this should allow them to craft resonant structures with any orientation they choose. The technique has already proven capable of creating gold structures that resonate at a particular frequency, says Sandia team member Bruce Burckel. The next step is to layer the structures to produce the collective behavior needed for a metamaterial.

    Gaining ground

    Other researchers have taken a different approach to overcoming absorption losses. The idea is to create structures that generate additional photons when they are hit with light at a particular frequency, akin to the way lasers produce a swell of photons at a single frequency. In the past few years, several groups have looked into adding metal nanoparticles and other materials to the insulating portion of a metamaterial that surrounds the metallic portions. The goal is to compensate for some of the light absorbed by the metals. For example, in the 9 June 2009 issue of Physical Review B, researchers led by Soukoulis and Wegener reported on a simulation in which tiny metallic particles incorporated in the gaps of split-ring resonators produced gains that compensated for the metamaterial's losses. But to date, they and others have not managed to show the effect experimentally.

    “This doesn't work well yet, but this is a big area of progress,” says Soukoulis. Much the same could be said for the entire field. “It's a pretty rich tool set,” says Sinclair. “People are still exploring what you can do.”

    * Materials Research Society, Boston, 30 November–4 December 2009.

  12. Archaeology

    Virtual Archaeologists Recreate Parts of Ancient Worlds

    1. Michael Bawaya*

    Using techniques borrowed from the entertainment industry, more and more archaeologists are boosting their imaginations and insights with virtual worlds.

    Palace tour.

    Video game technology lets researchers virtually stroll Nimrud's palace.


    Back in the late 1990s, archaeologist Sam Paley of the University at Buffalo in New York was frustrated in his study of the throne room of the 9th century B.C.E. Northwest Palace at Nimrud, the storied Assyrian capital in what is now Iraq. The room was embellished by paintings and bas reliefs aimed at impressing visitors, but the artwork and inscriptions were dispersed in bits and pieces in 60 museums around the globe, and Paley was having trouble picturing the layout. Then at a conference he heard a presentation by Donald Sanders, a leading proponent of using interactive 3D computer graphics in archaeology, and enlisted Sanders's help.

    The pair spent many years getting photographs from museums and building a virtual 3D model. Finally, they were able to imagine and test detailed hypotheses about the throne room's layout. For example, had there been enough light to see the artwork in the presumed windowless room? Sanders assumed that the Assyrians used oil torches. Different oils produce light in different ranges of the spectrum, and certain types of light accentuate certain colors, so he simulated various types of oils in strategically situated torches. Sanders concluded that the torches could have been fueled by several types of fish oil and positioned to enhance the art so people could have seen it. Today, “you can walk in the palace of a virtual-reality model,” says Paley. A 3D rendering of the model is now on display at the Metropolitan Museum of Art in New York City, and Sanders is at work on a reconstruction of the whole palace.

    The throne room is a classic example of the growth of virtual archaeology, in which archaeologists use computers to recreate the environment and conditions of the past, including objects, buildings, and landscapes with human actors, such as ancient battles. The field is a natural evolution of archaeology in the digital age, says archaeologist Maurizio Forte of the University of California, Merced, who spent 3 years recreating the landscape of Rome in the second century C.E. And although virtual archaeology arose in the mid-1990s, it is only now going mainstream, as archaeologists realize the benefits of using computers to make the most of their necessarily incomplete data.

    As costs go down, virtual archaeology “definitely is on the rise,” says Sanders, with several hundred projects worldwide and plans for a new multimedia journal in the works. Sanders, who has his own company, Learning Sites Inc. in Williamstown, Massachusetts, argues that virtual worlds offer archaeologists the best way to “test complex spatial, behavioral, or temporal hypotheses.”

    Recreating the Four Corners

    As the field continues to develop, a virtual expert “is now a standard member of the archaeological team” in many countries, says Bernard Frischer, an archaeologist and art historian at the University of Virginia, Charlottesville. Sanders says this is partly due to the decreasing cost of the tools of the virtual trade, such as laser scanners. (Sanders's projects cost anywhere from a few thousand dollars to $100,000.) The field also gets a boost from the entertainment industry: The technology is the same as that used in video games and movie special effects, and many universities have recently added 3D modeling programs. “Once [universities] invest in the technology, they have to look for excuses to use it,” says Sanders. Many recent archaeology grads are familiar with virtual techniques, although older archaeologists may not understand the technology as well, says Sanders, leading to “a digital divide.”

    Despite such frivolous roots, virtual experts are setting their sights on some of archaeology's thorniest scientific problems. For example, one of the enduring mysteries of American archaeology is why the Ancestral Puebloan peoples, or Anasazi, abandoned the Four Corners region of the Southwestern United States some 700 years ago, leaving striking cliff dwellings behind. Decades of study have yielded answers including conflict and climate change.

    Researchers with the Village Ecodynamics Project (VEP), led by Tim Kohler of Washington State University, Pullman, and Ziad Kobti of the University of Windsor in Canada, took a different tack to solving the mystery: They virtually recreated a prehistoric world, including everything from landscape to climate to human behavior. They were intent on solving several puzzles, including a cycle of population growth and decline from 920 to 1280 C.E., by which time the Pueblo peoples had left the area. Using archaeological data for variables such as numbers of households, ethnographic data on behaviors such as food sharing, and tree ring and soil data for climate clues, the researchers meticulously recreated part of the Ancestral Puebloans' homeland—an 1827-square-kilometer area in southwest Colorado.

    Then they put 200 virtual Pueblo households on the landscape and let them respond to various real-life scenarios, choosing how much corn to grow, how many animals to kill, and so on; their work will be described in a forthcoming book from the University of California Press.

    “A part of the simulation is looking at the economic structure of these societies,” explains Mark Varien, a VEP archaeologist with Crow Canyon Archaeological Center in Dolores, Colorado. If a household couldn't grow enough corn to survive, the simulation shows how they might have coped, for example, by trading with another household or spending more time hunting game. The simulation is not 3D, but the team did put representations of “agents” onto a two-dimensional landscape. “Spatial relationships are really important,” says Kohler, because location was key to determining how a household obtained food, water, and wood. Such simulations “let you look at the interaction between humans and their environment” in a way that traditional archaeology can't, says Varien.
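    The household-by-household logic described above is the core of any agent-based model. The sketch below is only a toy in the spirit of the VEP simulation: every number in it (yields, consumption, hunting returns, the climate range) is invented for illustration, whereas the real model is driven by tree-ring climate records, soil maps, and ethnographic parameters.

```python
import random

class Household:
    """A farming household that hunts deer when its corn stores run short."""

    def __init__(self):
        self.corn = 10.0  # stored food, arbitrary units

    def step(self, climate_factor, deer):
        """One simulated year: farm, consume, and hunt if the harvest falls short."""
        harvest = 8.0 * climate_factor   # better years yield more corn
        self.corn += harvest - 10.0      # fixed annual consumption
        if self.corn < 0 and deer > 0:   # shortfall -> hunt, depleting the herd
            self.corn += 5.0
            deer -= 1
        return deer

def run(years=50, n_households=20, seed=1):
    """Run the toy model and report surviving households and remaining deer."""
    random.seed(seed)
    households = [Household() for _ in range(n_households)]
    deer = 200
    for _ in range(years):
        climate = random.uniform(0.8, 1.3)  # stand-in for drought cycles
        for h in households:
            deer = h.step(climate, deer)
    survivors = sum(1 for h in households if h.corn >= 0)
    return survivors, deer

survivors, deer_left = run()
print(survivors, deer_left)
```

    Even this crude version reproduces the qualitative pattern Kohler describes: once harvests drift below household needs, hunting pressure on the shared deer herd rises sharply.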

    On the defensive.

    Simulations explored why the Anasazi built—and later left—their inaccessible cliff dwellings.


    One key result: Households resorted to overhunting deer by 900 C.E. Regardless of the variables incorporated in the simulation, households begin “to seriously deplete deer populations” at that time, says Kohler.

    The simulation also suggests, and archaeological evidence confirms, that turkeys were domesticated at about this time, perhaps because deer were scarce. Another notable result was extensive deforestation, which wasn't clearly seen in the archaeological record. “Without the simulation, you couldn't calculate the effect of people collecting wood every day,” says Kohler.

    Kohler believes that many small social units of the Ancestral Puebloans merged into a single large unit that was less resilient. The production of maize, the primary food for both people and turkeys, declined sharply around 1270 C.E. due to changes in climate that included drought and cooler summers. This, along with conflict and environmental degradation, led to the exodus.

    Around 1250 C.E., archaeological evidence suggests movement into defensible settings around springs; this is when the cliff dwellings of Mesa Verde National Park were built. The suggestion that the move “might have been due to competition or conflict is strengthened by the simulation results,” says Kohler. The simulations expanded on the existing evidence, revealing details of the unfortunate tale of “lots of people, organized in a way that was highly tuned to competition, heavily dependent on just one resource, and having to cope with widespread violence,” says Kohler. “Things fell apart.”

    The work is an innovative example of what simulations can achieve, says Thomas J. Baerwald of the National Science Foundation, who directs the program that funds VEP. By examining both human activities and the environment, the VEP team has helped reveal the relationship between them, Baerwald says.

    Traveling in time

    In addition to recreating ancient buildings and cultures, virtual archaeologists can go back in time to test hypotheses. For example, many archaeologists believe that the Inka, who fashioned a vast empire during the 15th and 16th centuries in western South America, built large stone pillars to record the sun's location on the horizon during the solstices. Researchers posited that two Inka towers on the Island of the Sun in Lake Titicaca in Bolivia served as markers of the sun's position on the winter solstice at sunset (Science, 9 October 1998, p. 227). But other scholars speculated instead that the towers were used as tombs.

    The solstice hypothesis could be empirically tested only during sunset on or near the June solstice, and the towers are only partially preserved, making verification difficult. So Frischer and colleague Chris Johanson of the University of California, Los Angeles, devised a virtual-empirical test that eliminates the constraints of time and space. They built a 3D model of the topography of the island and the sanctuary. Using astronomical data, they reconstructed the apparent course of the sun at sunset on dates surrounding the winter solstice in the year 1500 C.E. Their model confirmed the solstice hypothesis, Frischer and Johanson wrote in a book chapter last year, by showing that the “solar pillars would have been visible to the masses of devotees standing to the south,” says Frischer. “Once we have the model, we can explore at random,” he says. “We can be like time travelers.”

    Although it can be highly effective, virtual archaeology has its problems, too. Sanders says virtual models are built on so many different software platforms that “there are no standards,” which makes viewing them difficult. Given how easy it is to manipulate a virtual model, there is also the matter of trusting its accuracy. He says researchers should make sure that viewers know the evidence and assumptions behind each model.

    “Virtual archaeology's number-one problem is how to collect, peer-review, and publish all the 3D digital models that scientists are making in increasing numbers each year,” Frischer says. “At the moment, very few of these models, which must number over 1000 by now, are available online. Most—after having been used once for a specific purpose—are sitting in storage on old CD-ROMs and hard disks.”

    Frischer hopes to remedy this problem next year by launching SAVE, a peer-reviewed online journal for virtual models, though he still needs more funds for the venture. He adds that it would be highly ironic if archaeologists, charged with recording and publishing on the world's cultural heritage, left no record of their own virtual work.

    * Michael Bawaya is the editor of American Archaeology.

  13. Accelerator Physics

    The Next Big Beam?

    1. Daniel Clery

    A long-neglected accelerator technology is making a comeback bid, as its proponents point to possible applications in experimental physics, medicine, and even nuclear power.

    Coming soon.

    The EMMA prototype accelerator is expected to carry its first beam in March.


    If there's one thing you can be sure about with particle accelerators, it's that they're expensive to build. The €3 billion Large Hadron Collider at CERN is the most extreme example. But even at the other end of the scale, a hospital that wants an accelerator for proton beam therapy for cancer patients will likely have to fork out more than $100 million, and neither of the two most common existing technologies—cyclotrons and synchrotrons—is well-suited to the task. Now a handful of accelerator physicists are experimenting with a new type of machine—a cross between a cyclotron and a synchrotron—that avoids many of the shortcomings of both and is simpler and cheaper to build.

    Proponents of these machines, known as fixed-field alternating-gradient (FFAG) accelerators, say they would be ideal for applications such as proton therapy, inspecting the contents of cargo containers, and accelerating muons for a muon collider or neutrino factory. FFAGs may even revive the fortunes of a novel type of nuclear reactor called an energy amplifier, which needs a particle accelerator to drive it. After a modest start in Japan about 10 years ago, the field is “kind of exploding,” says Carol Johnstone of the Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois.

    Over the next few months, FFAG fans everywhere will be eagerly awaiting the first beams to whiz around the Electron Model for Many Applications (EMMA), a prototype of a variation on traditional FFAGs that promises to be even simpler and cheaper. A successful demonstration of this so-called nonscaling FFAG, which is under construction at the Daresbury Laboratory in the United Kingdom, could open the floodgates for the application of this technology. “It'll be beautiful if they work,” says Roger Barlow of the University of Manchester in the United Kingdom.

    FFAGs were first proposed in the 1950s, and several electron accelerators were built in the United States. But FFAGs require large, complex magnets to keep particles on track, and the technology lost out to the rival synchrotron, which emerged at about the same time. Most earlier accelerators had been cyclotrons—machines that use a fixed, uniform magnetic field to steer beams of fast-moving particles in a circle. Particles are injected into the middle of the disk-shaped device and, once or more per circuit, are given a “kick” with an electric field to speed them up. As they accelerate, they spiral outward until they reach the outer edge of the magnetic field and leave the machine. Hence cyclotrons can produce beams of only a single energy, which is limited by the size and strength of their magnets.

    Synchrotrons take a different approach, using a number of variable electromagnets arranged in a ring and ramping up the magnetic fields as the speed of the particles increases. But once you start ramping up the energy, you can't inject more lower-energy particles at the same time. So synchrotrons can reach a higher energy but not a high particle current.

    FFAGs marry the synchrotron's ring of magnets with the steady magnetic fields of a cyclotron. Although an FFAG's magnetic field is fixed in time, it changes in space: as you move farther from the center of the ring, the field increases, preventing faster-moving particles from spiraling out of the magnets, much as a banked track does for race cars. As long as the field increases strongly enough, an FFAG can achieve higher energies than a cyclotron. And with its fixed fields, you can keep injecting more low-energy particles while higher-energy ones are still being accelerated, leading to a higher current than a synchrotron.

    In the 1990s, with improved magnet technology and computer modeling, researchers at the KEK particle physics lab near Tokyo, led by Yoshiharu Mori, began rethinking FFAGs. In 2000 they built a proof-of-principle device with a beam energy of 1 million electron volts (1 MeV), followed by a 150-MeV machine in 2003. Other researchers took note, and today another six FFAGs have been built, accelerating protons, electrons, and alpha particles. Three others are under construction, and about 20 designs are in development.

    Now there are also variations on the original design. In the late 1990s, researchers in the United Kingdom and the United States began looking at an FFAG for accelerating muons, short-lived particles that need to be brought up to speed very quickly. The rapid acceleration, they figured, would also make possible a radical simplification. In a traditional FFAG magnet, the field increases by a power of the radius as you move away from the center of the ring. This formula keeps the shape of the beam path constant as the beam gains energy and moves outward—the path “scales” with the radius. This scaling suppresses resonances that can throw the beam off line. But this requires very complex magnets. The KEK machines “looked the same as the [1950s] U.S. ones, very dense with big chunky magnets,” says Ken Peach, director of the John Adams Institute for Accelerator Science in the United Kingdom.

    The muon collider team realized that if the beam accelerates quickly enough, the resonance wouldn't have time to build up. Hence they could build magnets that vary linearly with radius and don't scale. Such magnets would be smaller and simpler to make. As a proof of principle, researchers in the United Kingdom and elsewhere designed an electron accelerator based on a nonscaling FFAG. By the time they finished the design in 2005, however, no money was available for their long-sought muon collider.
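
    The two magnet recipes can be compared directly: a scaling FFAG's field grows as a power of radius, B(r) = B0(r/r0)^k, while a nonscaling magnet's field varies linearly with radius. A minimal sketch, in which the values of B0, r0, the exponent, and the gradient are purely illustrative and not taken from any real machine:

    ```python
    def scaling_field(r, b0=1.0, r0=1.0, k=8.0):
        """Scaling FFAG: field grows as a power of radius, B = B0*(r/r0)**k,
        so the beam path keeps the same shape as the beam moves outward."""
        return b0 * (r / r0) ** k

    def nonscaling_field(r, b0=1.0, r0=1.0, g=8.0):
        """Nonscaling FFAG: field varies linearly with radius,
        which permits smaller, simpler magnets."""
        return b0 + g * (r - r0)

    # The power law stiffens much faster with radius than the linear profile:
    for r in (1.0, 1.1, 1.2):
        print(f"r={r:.1f}  scaling={scaling_field(r):6.2f}  "
              f"nonscaling={nonscaling_field(r):5.2f}")
    ```

    The rapidly steepening power-law profile is part of why scaling machines need the “big chunky magnets” Peach describes, while the linear profile is what lets nonscaling magnets stay small and simple.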

    Japan weighs in.

    Experimental FFAG at Kyoto University is used in “energy amplifier” research.


    They had more success when they teamed up with some oncologists who saw the FFAG as an ideal machine for proton therapy. Cyclotrons are the standard machines for such treatment, but FFAGs would be cheaper. They could also easily vary the beam energy to penetrate more deeply into tissue, enabling clinicians to scan across tumors in 3D to destroy all cancer cells. “We were convinced there is a role for FFAGs in clinical oncology,” says Peach.

    With this real-world application in prospect, the team won £16 million from the U.K. research councils for a 4-year project known as CONFORM, which will construct EMMA, design a prototype nonscaling proton FFAG for cancer treatment, dubbed PAMELA, and look for other applications. With the project now well into its third year, four of EMMA's seven segments are complete, and researchers expect the machine to carry its first beam on 1 March. Accelerator physicists across the world have high expectations for nonscaling FFAGs, so a lot is riding on EMMA's success.

    Meanwhile, work has been progressing on the design for PAMELA, which will be able to accelerate both protons and carbon ions; some studies suggest carbon ions may be even better than protons for cancer treatment. “It turned out that moving from an electron to a carbon FFAG is not so simple,” says Peach. As a result, PAMELA may end up somewhere between a scaling and a nonscaling FFAG, with magnetic fields that don't vary linearly with radius but that are still simpler than in a traditional FFAG.

    Barlow says the United Kingdom's National Health Service is about to call for bids to build two proton-therapy centers. FFAGs won't be ready in time for those jobs, but he thinks they could form a second generation. “I'm increasingly positive about proton therapy,” he says. But Johnstone cautions that medical administrators are conservative and reluctant to back new technologies. “It's hard to break into that market,” she says.

    Quick fix.

    In scaling FFAGs, particles follow same-shaped paths to avoid beam instabilities. Newer FFAGs accelerate particles faster than such trouble can arise.


    Johnstone is, however, working on another application that may come to fruition sooner. In collaboration with the company Passport Systems Inc. based near Boston, she has designed ultracompact electron FFAGs whose beams will be used to generate x-rays for scanning cargo containers for explosives, nuclear materials, or other contraband. A prototype scaling FFAG has already been built, and a nonscaling version is in the cards for 2010. “There's nothing like this on the market,” she says. “You can throw them in the back of a truck.”

    The CONFORM project has discussed a more ambitious application: the accelerator-driven subcritical reactor (ADSR), otherwise known as an energy amplifier. This technology, first proposed by Nobel physicist Carlo Rubbia in 1993, starts with a reactor containing slightly too little nuclear fuel to sustain a chain reaction. Instead, nuclear reactions are helped along by an external source of neutrons: a powerful particle accelerator that fires protons into a heavy metal target, knocking out neutrons. An ADSR would produce less high-level waste than a conventional reactor does. It is also inherently safe: Nuclear reactions can't keep going without the external neutrons, so to turn off the reactor you just turn off the beam.
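
    The “just turn off the beam” argument rests on simple subcritical arithmetic: with an effective multiplication factor k_eff below 1, each injected neutron starts a chain that yields 1 + k_eff + k_eff² + … = 1/(1 − k_eff) neutrons in total, so the reactor's output tracks the external source. A sketch with illustrative k_eff values (the article quotes no figure):

    ```python
    def multiplication(k_eff):
        """Neutrons produced per injected source neutron in a subcritical
        core: the geometric series 1 + k + k^2 + ... = 1/(1 - k_eff)."""
        assert 0 <= k_eff < 1, "an ADSR core must stay subcritical (k_eff < 1)"
        return 1.0 / (1.0 - k_eff)

    # Illustrative values only: the closer k_eff sits to 1, the less beam
    # power the accelerator must supply, but the smaller the safety margin.
    for k in (0.90, 0.95, 0.98):
        print(f"k_eff={k:.2f}: ~{multiplication(k):.0f} neutrons per source neutron")
    ```

    Because every chain is finite, switching off the accelerator cuts the neutron supply and the reactions stop, which is the inherent-safety point made above.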

    The biggest question mark is the accelerator. Those built so far have been one-off research machines: too expensive, temperamental, and low-powered to keep a nuclear reactor running. The planned accelerator for the European Spallation Source will likely cost €400 million and produce a 5-megawatt (MW) beam. An ADSR would need twice that power. The CONFORM researchers realized that PAMELA's design could be adapted to meet the requirements of an ADSR. “As soon as we looked at the ADSR, it looked like a goer,” says Robert Cywinski of the University of Huddersfield in the United Kingdom.

    Meanwhile, Japanese researchers were already thinking along similar lines. Over the past few years, the KEK team has put together the world's first ADSR experiment. Using the Kyoto University Critical Assembly (KUCA) as their reactor, they assembled a cascade of three FFAGs boosting a proton beam first to 2.5 MeV, then to 20 MeV, and finally to 100 MeV. The researchers made KUCA subcritical by lowering control rods into the uranium core; then, on 4 March 2009, they fired neutrons into the reactor. Measurements clearly indicated that the injected neutrons were driving nuclear reactions in the subcritical core. Mori says that FFAGs show potential for being able to drive an ADSR, but “it would be necessary to overcome many problems and difficulties, such as beam losses, operational efficiency, reliability, safety, and so on.”

    The U.K. researchers are taking a different tack with plans to fuel their reactor with thorium. Thorium is more abundant in the earth than uranium is, and its fuel cycle doesn't produce material that could be diverted into nuclear bombs. The team has made a case for a £300 million, partly publicly funded project to develop the ADSR technology over 5 years and is waiting for the government's response. Just over half the money will pay for a series of prototype FFAGs to demonstrate that they are powerful, reliable, and affordable enough to be practical in an ADSR. The rest of the funding will cover materials research and simulation of the proposed reactor. If that government investment is followed by up to £2 billion from industry, Cywinski says the project could produce a working 600-MW prototype by 2025. “It could create for the U.K. a new nuclear export industry. You could sell these to all those countries you can't sell conventional nuclear,” says Cywinski.

    Even in the early days of the FFAG's revival, researchers are thinking big. But for the moment, all eyes are on EMMA.