News this Week

Science  20 May 2011:
Vol. 332, Issue 6032, pp. 900
  1. Around the World

    1 - Berlin
    Ethics Commission Calls For Swift Nuclear Phaseout
    2 - Cameroon, Chad, Nigeria
    More of New Meningitis Vaccine for Africa
    3 - Washington, D.C.
    NIH Grant Applicants Face Low Odds
    4 - Melbourne, Australia
    Synchrotron Funding Up in the Air
    5 - London
    Astronomy, Particle Physics Cuts Decried
    6 - Paris
    Goodbye, Rinderpest

    Berlin

    Ethics Commission Calls For Swift Nuclear Phaseout

    Germany should phase out nuclear power by 2021, according to a leaked draft of a report from the Ethics Commission on Safe Energy Supply created by Chancellor Angela Merkel in the wake of the Fukushima catastrophe. The commission, chaired by two scientists and comprising representatives from industry, research, politics, and religion, recommends permanently shutting down the country's seven oldest nuclear reactors, followed by “a complete withdrawal from nuclear energy” within 10 years or less.

    The seven reactors were taken off line shortly after problems started at Fukushima; the commission says this demonstrates that the 8.5 gigawatts they produce can be obtained elsewhere. Renewable energy, natural gas, and coal can take care of the rest without increasing Germany's carbon dioxide emissions or shaking the economy, the report says.

    Reactor 1 at Philippsburg Nuclear Power Plant was one of seven German reactors shut off in March.

    CREDIT: LOTHAR NEUMANN, GERNSBACH/WIKIPEDIA

    A law enacted in 2002 would have closed all of Germany's 17 nuclear power plants by 2023, but Merkel's coalition passed a new law last autumn that delayed that phaseout by more than a decade. The leaked report could change significantly before the official version is released at the end of the month. In any case, it is expected to strongly influence the direction the government takes.

    http://scim.ag/german-nuclear

    Cameroon, Chad, Nigeria

    More of New Meningitis Vaccine for Africa

    An affordable, effective vaccine against deadly bacterial meningitis A will be rolled out in three more African countries this year, thanks to $100 million in new funds from the GAVI Alliance, a public-private global health partnership. Massive immunization campaigns will start in Cameroon, Chad, and Nigeria by December, ahead of the dry season when epidemics sweep across the so-called meningitis belt of Africa. MenAfriVac is the first vaccine developed specifically for Africa, at a price poor governments can afford—roughly 50 cents per dose. Since its initial introduction last year in Burkina Faso, Mali, and Niger, cases have dropped dramatically. “Men A has disappeared from Burkina Faso,” says Marc LaForce, head of the Meningitis Vaccine Project, which developed the vaccine. LaForce hopes the donor community will commit enough money so that all 25 countries in the belt are covered by 2015.

    Washington, D.C.

    NIH Grant Applicants Face Low Odds

    U.S. National Institutes of Health (NIH) Director Francis Collins warned last week that the fraction of NIH grant applications that are funded this year could drop to 18% or 17%, the lowest percentage ever. The crash in so-called success rates comes because NIH's budget was cut by $322 million, or 1%, in last month's budget agreement for the current fiscal year. Collins's warning was part of his testimony to the Senate appropriations subcommittee that oversees NIH's budget. Subcommittee chair Senator Tom Harkin (D-IA) also worried about next year, noting that if a House of Representatives proposal to cut health spending by 9% prevails, “severe reductions to NIH research would be unavoidable.” http://scim.ag/low-odds

    Melbourne, Australia

    Synchrotron Funding Up in the Air

    Australia's budget plan doesn't address funding for the Australian Synchrotron.

    CREDIT: DANIEL MENDELBAUM/STATE OF VICTORIA

    When the Australian government released its new budget proposal last week, it didn't contain the drastic cuts many scientists had feared. But anxiety is still rippling through users of the Australian Synchrotron. Neither the federal government nor the Victoria state government, which also released budget plans recently, addressed how the world-class facility will be funded beyond June 2012, when its original 5-year financial plan ends. The bulk of the facility's funding comes from the two governments. Andrew Peele, a physicist at La Trobe University in Victoria who has been head of science at the synchrotron since late last year, says the facility's leadership had applied for additional government funding to expand the facility. He's hopeful a decision will be made soon. Long-term funding “is always an issue for major research institutions, [and] there are a number of alternatives,” he says. http://scim.ag/oz-synch

    London

    Astronomy, Particle Physics Cuts Decried

    A U.K. parliamentary committee has warned that budget cuts planned for astronomy and particle physics will jeopardize Britain's ability to stay at the forefront of those disciplines.

    Although science in general was relatively well protected from severe government spending cuts announced last year, astronomy and particle physics took a big hit. A report released last week by the House of Commons Science and Technology Committee says the two fields will have 50% less funding in 2014–15 than they did in 2005.

    The committee was particularly scathing about the decision to withdraw support from all ground-based optical and infrared telescopes in the Northern Hemisphere, noting that British astronomers will lose access to cutting-edge instruments and the country's reputation as an international partner could suffer. “The idea that subjects like astronomy and particle physics do not provide immediate economic returns and therefore can be sacrificed at the altar of cutbacks is nonsense,” says committee chair Andrew Miller. Welcome words for Britain's big-science supporters, but the impact will be limited: the committee has no direct legislative power. http://scim.ag/UKastro

    Paris

    Goodbye, Rinderpest

    Animal health officials are about to report that rinderpest, which has decimated cattle herds for millennia, has been eradicated worldwide after a 17-year vaccination effort. On 25 May in Paris, the World Organization for Animal Health will certify the last eight countries, which include Micronesia, Sri Lanka, and Kazakhstan, as free of the disease. Then on 28 June in Rome, the governing body of the United Nations' Food and Agriculture Organization is expected to issue a declaration of global eradication. Rinderpest, a viral disease, was once endemic across Eurasia and Africa with periodic outbreaks killing calves and devastating herders. The virus was last detected in 2001 in Kenya. Rinderpest will be the first animal disease ever to be eradicated and only the second disease of any kind, after smallpox.

  2. Random Sample

    They Said It

    “Climate change is occurring, is very likely caused primarily by the emission of greenhouse gases from human activities, and poses significant risks for a range of human and natural systems. Each additional ton of greenhouse gases emitted commits us to further change and greater risks.”

    —The National Research Council Committee on America's Climate Choices, in its fifth and final report to the United States Congress on climate change.

    Spacetime Souvenir

    CREDIT: NATIONAL MUSEUMS LIVERPOOL

    On 26 May 1933, a wild-haired German physics professor stepped off a ferry from Oostende, Belgium, and handed this landing card to border authorities in Dover, England. This year, museum curators on an artifact-hunting trip to a border agency office near Heathrow Airport found the card, signed by Albert Einstein, among the personal collections of immigration officers.

    Einstein “already had his Nobel Prize,” so the border agent who stamped it “might have thought, ‘Ooh, I'll keep this,’” says Lucy Gardner, assistant curator at the Merseyside Maritime Museum in Liverpool, where the card went on display 10 May. On the card, Einstein records his nationality as Swiss; he had left Germany, where the Nazis had denounced his theory of relativity and burned his books, in 1932. He lists Oxford, where he would give a series of lectures, as his expected address. Later that year, Einstein immigrated to the United States, never to return to Germany.

    The View From Here

    CREDIT: NICK RISINGER, SKYSURVEY.ORG

    Ever wonder what the night sky would look like if all the stars grew thousands of times brighter? Go see for yourself at www.skysurvey.org.

    The Photopic Sky Survey is the labor of love of amateur astronomer Nick Risinger, who quit his job as a marketing director in Seattle, Washington, and spent a year lugging sensitive cameras almost 100,000 kilometers around the United States and to South Africa. He then stitched together the 37,440 images to create the largest ever true-color sky survey—a zoomable, 360° panorama of the universe from Earth's point of view.

    “Its main value is as an educational tool,” Risinger says. But Jonathan McDowell, an astrophysicist at Harvard University, says astronomers could use the survey's 5000 megapixels to identify objects to observe. “The human eye is very good at spotting things that are unusual in shape or color,” he says. “And of course, it's very pretty.”

    By the Numbers

    <10 — Stars in the constellation Orion visible to 59% of 2188 participants in a U.K. assessment of light pollution. Just 1% of star watchers had skies dark enough to make out more than 30 stars.

    $796 billion — The economic impact of the Human Genome Project on the U.S. economy over the past decade, according to a report commissioned by the Life Technologies Foundation. The Human Genome Project cost the U.S. government $3.8 billion.

    $225 million — Donation to the University of Pennsylvania School of Medicine by Philadelphia philanthropists Raymond and Ruth Perelman, one of the largest gifts ever given to a medical school.

    From Bench to Big Screen

    CREDIT: COURTESY OF VALERIE WEISS

    You think your Ph.D. was tough? Harvard Medical School grad student Samantha Bazarick's cell biology project has been stuck for a whole year. Meanwhile, her lab is populated by petulant postdocs and abusive professors, and her love life is in shambles. Her carefully planned ascent to scientific glory is in a tailspin.

    But don't worry—Samantha is fictional. She's the main character in Losing Control, a new film by former biophysicist Valerie Weiss. (Spoiler alert: The plot echoes recent intrigues involving scientists at Harvard Medical School.) The film is now on the festival circuit and will make a stop at the National Academy of Sciences in Washington, D.C., in October (www.losingcontrolmovie.com).

    Weiss herself finished a Ph.D. at Harvard in 2001, but she says the film is not based on her own experience. “I actually had a wonderful Ph.D.,” she says. “It's supposed to be a screwball comedy.” Nonetheless, Samantha's nightmare scenario will resonate with many suffering scientists.

    Science and Hollywood aren't so different, Weiss says. “The motivations are the same: stories that are new and fresh.” And getting funding for your work is just as great a preoccupation and time drain. But there are advantages to her new profession. “In science you have to come back to the hard data,” she says. “But as an independent filmmaker, I can follow my imagination anywhere.”

  3. Tohoku-Oki Earthquake

    Fukushima Revives The Low-Dose Debate

    1. Dennis Normile

    The general public avoided exposure to high levels of radioactivity, but questions linger about the long-term effects of contamination.

    Hot job.

    Technicians check radiation hourly in a gravel lot in Fukushima City. Exposure has dropped but remains 35 times above background.

    PHOTO CREDIT: D. NORMILE/SCIENCE; GRAPH SOURCE: FUKUSHIMA PREFECTURE

    FUKUSHIMA, JAPAN—At 5 p.m. sharp, Mitsuru Itou watches as a technician steps inside a quartet of orange traffic cones and black-and-yellow traffic bars marking a “keep out” area in a gravel lot. Holding a radiation meter at his waist, the technician waits for the instrument to stabilize. Then every 30 seconds for the next 2½ minutes he recites the count. Itou, a supervisor with the Public Health and Welfare Office of Fukushima Prefecture, correctly predicts that the readings will average 1.6 microsieverts per hour. “That's what [the radiation] has come down to for some time now,” he says. The results are phoned in to a disaster center, which posts them to its Web site. This ritual is repeated every hour at seven locations across the prefecture to track environmental radiation from the Fukushima Daiichi nuclear power plant, 63 kilometers southeast of Fukushima city. The measured levels range from two to 1000 times normal background radiation—and residents, officials, and scientists wonder what that may mean for public health.

    The magnitude-9.0 earthquake and tsunami on 11 March knocked out nuclear fuel cooling systems at the power plant. In the days that followed, overheating triggered hydrogen explosions that spewed radioisotopes into the air. Radiation spiked 4 days after the first explosion, according to measurements here and at other ground-monitoring sites hastily set up after the earthquake. Since then, radiation levels have ebbed as short-lived radionuclides, such as iodine-131 with a half-life of 8 days, decay into stable isotopes.
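
    The pace of that ebb follows simple half-life arithmetic. Below is a minimal sketch using the 8-day iodine-131 half-life cited above; the example time points are our own choices, for illustration only.

```python
# Half-life arithmetic for iodine-131 (8-day half-life, per the article).
# The chosen time points are illustrative assumptions, not figures from the article.

HALF_LIFE_DAYS = 8.0  # iodine-131

def remaining_fraction(days: float) -> float:
    """Fraction of the original iodine-131 activity left after `days` of decay."""
    return 0.5 ** (days / HALF_LIFE_DAYS)

for days in (8, 16, 40, 80):
    print(f"after {days:>2} days: {remaining_fraction(days) * 100:5.1f}% of initial activity")
# After roughly 5 half-lives (40 days) only about 3% remains, which is why the
# early radiation spike subsided within weeks, while longer-lived cesium isotopes
# linger on the ground for years.
```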

    Across Fukushima and neighboring prefectures, small amounts of cesium-134 and cesium-137, isotopes with half-lives of 2 and 30 years respectively, lie on the ground. Cleanup workers have stripped contaminated topsoil from some schoolyards, and remediation or permanent evacuation is likely for the worst areas. But for much of the prefecture, “we're stuck, there are no practical countermeasures,” says Hisashi Katayose, a Fukushima official in charge of radiation monitoring.

    As a result, several thousand of Fukushima's 2 million residents have been thrust into the middle of a vigorous scientific debate about the health effects of long-term exposure to low levels of radiation. “We're all guinea pigs,” says Akira Watanabe, a meteorologist who is vice president of Fukushima University here. A central question is whether there's a threshold below which radiation has no ill effect. “Dose threshold is a very contentious issue in the radiation community,” says radiation epidemiologist Roy Shore, research head at the Radiation Effects Research Foundation in Hiroshima. Some researchers believe even unavoidable background radiation can be a factor in causing cancer. Others argue that tiny doses of radiation are not harmful. Some scientists even claim that low doses, by stimulating DNA repair, make you healthier—an effect known as hormesis.

    Studies in Fukushima could help clarify the picture. But getting answers will not be easy. Radiation exposure levels for most people were elevated so minutely above background that it may be impossible to tease out carcinogenic effects from other risk factors, such as smoking or diet. “In order to detect an elevation in risk, one needs to study much larger numbers of people,” Shore says, especially given that 40% of all Japanese develop cancer.

    That has experts wondering whether and how to carry out such studies. “It is difficult to say at this point, especially since the crisis is not over,” says Shunichi Yamashita, a radiation health expert at Nagasaki University who is advising the Fukushima prefectural government. Forging ahead with a population study, as daunting as it may be, could nevertheless have a scientific payoff. “If you do the study and don't find anything, that should be an important message,” says Dale Preston, a biostatistician specializing in radiation health effects at Hirosoft International in Eureka, California.

    Where to draw the line?

    Perhaps the sole point on which scientists agree is that radiation damages DNA in ways that can cause cancer many years after exposure. When radioisotopes lodge in certain organs—such as iodine-131 in the thyroid gland—the constant bombardment of surrounding tissue can overwhelm repair mechanisms and trigger cancer.

    The clearest insights come from decades of follow-up on survivors of the atomic bombings of Hiroshima and Nagasaki. These studies have linked an acute dose of 100 millisieverts (mSv) of radiation—16 times the amount that an individual receives on average from all sources over the course of a year—to a 1.05-fold increase in the chance of developing some form of cancer. Children with similar exposures appeared to have a higher risk of developing cancer later. These risks scaled linearly as exposures increased.

    But the health effects of chronic low-level radiation exposure over years or decades are far from clear. Several large cohort studies of medical x-ray technicians and nuclear industry workers suggest a slight increase in cancer risk at exposures below 100 mSv, Shore says. To err on the safe side, most radiation protection agencies follow the linear no-threshold model, which posits that risk diminishes with decreasing exposure but that any increase above background poses a cancer risk. Extrapolating from this model to estimate health effects in a population “is not wise because of the uncertainties,” Shore says.
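
    To make the stakes of that modeling choice concrete, here is a minimal sketch of the linear no-threshold extrapolation, using the A-bomb figure above (roughly a 5% relative increase at an acute 100 mSv). Treating that number as a simple linear coefficient at low doses is precisely the extrapolation Shore cautions against, so the output illustrates the model, not established risk.

```python
# Linear no-threshold (LNT) sketch: scale the A-bomb survivor figure (relative
# risk ~1.05 at an acute 100 mSv) linearly down to low doses. This extrapolation
# is contested; the numbers below illustrate the model, not established risk.

EXCESS_RISK_PER_100_MSV = 0.05  # ~5% relative increase at 100 mSv (A-bomb studies)

def lnt_relative_risk(dose_msv: float) -> float:
    """Relative cancer risk under a simple LNT assumption (illustrative only)."""
    return 1.0 + EXCESS_RISK_PER_100_MSV * (dose_msv / 100.0)

for dose in (1, 10, 20, 100):
    print(f"{dose:>3} mSv -> relative risk {lnt_relative_risk(dose):.4f}")
# 1 mSv -> 1.0005, 10 mSv -> 1.0050, 20 mSv -> 1.0100, 100 mSv -> 1.0500
```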

    Opportunities to narrow uncertainties have been missed. In the aftermath of the Soviet Union's April 1986 Chernobyl nuclear accident, which spewed radionuclides over a swath of Europe, “there was no continuity, no overarching panel looking at how science should be done,” says Ronald Chesser, a radiation biologist at Texas Tech University in Lubbock. The subsequent Soviet collapse, scarce funding, imprecise dosimetry, and difficulties tracking people over the years have limited the number of studies and their reliability, he says. The United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) concluded in a 2008 report that over 6000 cases of thyroid cancer in young people could be linked to Chernobyl but that evidence for other cancers was inconclusive. To resolve outstanding questions, on 26 April the World Health Organization's International Agency for Research on Cancer in Lyon, France, asked the international community to support a Chernobyl Health Effects Research Foundation to conduct life-span studies, similar to those following A-bomb survivors in Japan.

    Animal studies have yielded conflicting data. Laboratory experiments on animals indicate that as doses decrease, less and less damage escapes DNA-repair mechanisms, says Yoshihisa Matsumoto, a radiation biologist at the Tokyo Institute of Technology. “There must be some threshold below which the damage is completely repaired,” he says. Chesser says some of his group's studies of mice exposed to radioactivity around Chernobyl hint at hormesis: Small exposures over 10 to 45 days, they found, appeared to temper damage from an acute radiation dose delivered in the lab later. He thinks the reaction to low doses could be quite complex. “There's not going to be a uniform response of all biological functions to low levels of radiation,” Chesser predicts.

    Patchy contamination

    Japan's experience tracking A-bomb survivors, an early start gathering data on environmental exposures in Fukushima, and a family registry system that tracks virtually all individuals all offer “great advantages” in devising more definitive low-dose studies, says Preston, who believes such a study would be well worth the cost. “I think we will learn something important,” he says.

    The 800 or so workers who have helped bring the Fukushima reactors under control will be included in an ongoing study of nuclear industry workers by the Tokyo-based Radiation Effects Association. Many workers are getting higher doses in weeks than they would have received on the job over a year. Fukushima residents facing higher than background exposure can blame an unfortunate shift in the prevailing winds. Just after the 11 March disaster, much of the radioactive contamination from the reactor complex was swept out to sea. But on 15 March, a counterclockwise wind carried contamination back over the prefecture, Watanabe says. “And then it rained.”

    Authorities are keeping a close vigil on the patchy contamination. In addition to the sampling by Fukushima Prefecture, the education ministry is monitoring radiation levels nationwide, including at 100 locations in the prefecture. A team from Fukushima University recently mapped radiation levels at 370 spots in the prefecture and, using weather balloons, confirmed that atmospheric radiation levels have dropped to near background levels.

    That broad-brush impression of radioactive contamination of the landscape isn't sufficient for population studies. Shore says it will be important to reconstruct exposures to identify a cohort with the highest exposures. Researchers also need to ascertain where people were during the peak exposure period and where they obtained food and drinking water. Any robust study would also include detailed medical histories and information on smoking habits, diet, and possible exposure to other toxicants, as well as matched controls with little or no exposure. That information “would make possible an informed long-term cohort study,” Shore says.

    Aloft.

    Fukushima University's Akira Watanabe is leading an effort to map radiation in the air and on the ground.

    CREDIT: D. NORMILE/SCIENCE

    Estimating individual doses from environmental data is neither easy nor precise. An alternative technique was developed by a team led by David Brenner, a radiation biophysicist at Columbia University Medical Center in New York City, to plan a response to a radiation release by terrorists. The method rapidly screens blood samples for fragments of DNA and DNA-repair complexes; exposures are calculated based on the number of fragments.

    Scientists hope a respected entity will organize a high-quality research plan involving all levels of government. Fukushima Medical University is bidding for that role. A spokesperson has confirmed that the university will establish a research initiative with support from radiation health experts at Nagasaki and Hiroshima universities. Details may be released next month.

    Some researchers doubt that any study in Fukushima, no matter how well devised, will reveal much. The radiation exposure of the general population “is too small to give a statistically significant increase in stochastic effects such as cancer,” argues Ohtsura Niwa, professor emeritus of radiation biology at Kyoto University. But even negative data would complement UNSCEAR's conclusions on Chernobyl, Niwa says, “and, in this sense, have global implications.” As for the linear no-threshold model, Preston says, “I don't think anything [done in Fukushima] is going to resolve that debate.”

    One real effect of the radioactive contamination is the gnawing fear among parents—groundless or not—that low levels of radiation could harm their children. For that reason alone, Yamashita says, “a center or some sort of system to support long-term health follow-ups is definitely necessary.”

  4. Tohoku-Oki Earthquake

    Schoolyard Radiation Policy Brings a Backlash

    1. Dennis Normile

    The Japanese government's most controversial misstep in response to the Fukushima nuclear power plant crisis may have been the release of guidelines on allowable radiological contamination in schoolyards.

    TOKYO—The Japanese government has made a number of missteps during the 2-month-long Fukushima nuclear power plant crisis. But the most controversial may have been the release of guidelines from the education ministry on allowable radiological contamination in schoolyards. They seem to allow children to accumulate radiation exposures of 20 millisieverts (mSv) over the course of a year. By comparison, nuclear industry workers in Japan can absorb no more than 100 mSv per year; the limit for U.S. nuclear personnel is 50 mSv per year.

    The 19 April announcement unleashed a torrent of criticism from civic groups and experts. “Setting [such radiation limits] for elementary schools is inexcusable,” Toshiso Kosako, a radiation health expert at the University of Tokyo, said on 30 April, when he resigned as an adviser to Prime Minister Naoto Kan on the nuclear crisis. Because children are known to be more susceptible than adults to risks of cancer from radiation, Physicians for Social Responsibility, a U.S. group opposed to nuclear proliferation, condemned the exposure limit as “unconscionable.”

    Dirty dirt.

    Fukushima schools are stripping contaminated topsoil from playgrounds.

    CREDIT: KYODO/LANDOV

    The ministry has backpedaled—but not fully retreated. On 11 May, it released suggestions for removing contaminated topsoil from schoolyards to reduce radiation exposures. But it did not change or retract the exposure guidelines.

    In its “provisional idea” for acceptable levels of radiation in schoolyards, the education ministry cited a 2009 recommendation from the International Commission on Radiological Protection (ICRP), an Ottawa-based nongovernmental organization. During emergencies, ICRP Publication 109 states, populations can be exposed to 20 to 100 mSv per year. The education ministry calculated that children could spend 8 hours a day in a schoolyard exposed to as much as 3.8 microsieverts per hour, and 16 hours a day indoors exposed to 1.52 microsieverts per hour, and not exceed the 20-mSv limit. Civic groups contend that the education ministry should follow another ICRP recommendation, which states that exposure limits for long-term residence in contaminated areas after an accident should be kept “in the lower part of the 1-20 mSv/year” range.
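
    The ministry's arithmetic can be checked directly. Here is a minimal back-of-the-envelope sketch using the hourly rates and daily hours quoted above; the 365-day exposure period is our assumption.

```python
# Back-of-the-envelope check of the education ministry's schoolyard calculation.
# Hourly dose rates and daily hours are the ministry's figures quoted in the
# article; exposure is assumed to continue for a full 365-day year.

OUTDOOR_RATE_USV_PER_H = 3.8   # microsieverts/hour in the schoolyard
INDOOR_RATE_USV_PER_H = 1.52   # microsieverts/hour indoors
HOURS_OUTDOOR, HOURS_INDOOR = 8, 16

daily_dose_usv = HOURS_OUTDOOR * OUTDOOR_RATE_USV_PER_H + HOURS_INDOOR * INDOOR_RATE_USV_PER_H
annual_dose_msv = daily_dose_usv * 365 / 1000.0

print(f"daily dose: {daily_dose_usv:.1f} microsieverts")
print(f"annual dose: {annual_dose_msv:.1f} mSv")  # ~20.0 mSv, just under the guideline
```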

    In the past few weeks, several schools took matters into their own hands and stripped topsoil from their grounds. On 11 May, the ministry jumped on that bandwagon, announcing test results showing that swapping the top 10 centimeters of topsoil with dirt from deeper down cut surface radiation 90%. Stripping and burying the topsoil in a deep hole reduces surface radiation 99%. The ministry is leaving final decisions on what to do in the hands of local officials.

  5. Tohoku-Oki Earthquake

    Crippled Reactors to Get Cooled and Wrapped

    1. Dennis Normile

    The crisis at the stricken Fukushima Daiichi nuclear power plant is far from over. Some 100,000 residents who were evacuated will not return home until the reactors are firmly under control.

    TOKYO—The crisis at the stricken Fukushima Daiichi nuclear power plant may have faded from the headlines, but it's far from over. To cope with the loss of reactor cooling systems knocked out by the 11 March earthquake and tsunami, the plant's owner, Tokyo Electric Power Co., has installed jury-rigged cooling setups that have cut radiation emissions dramatically. But some 100,000 residents who were evacuated will not return home until the reactors are firmly under control. Last month, Tokyo Electric unveiled a two-stage plan to build more robust cooling systems and reduce radiation leaks within 3 months, then, 3 to 6 months later, achieve a cold shutdown in which fuel is cooled by water below the boiling point at atmospheric pressure.

    Off limits.

    Robots are finding radiation too high for humans.

    CREDIT: TOKYO ELECTRIC POWER CO.

    Nuclear fuel in four of the plant's six reactors overheated, with extensive core damage in three units. Last week, a robotic inspection increased suspicions that the fuel in unit 1 may have melted through the bottom of the pressure vessel and pooled at the base of the containment structure. Hydrogen explosions completely blew the upper walls and roofs off two units and severely damaged a third; vessels and piping are leaking contaminated water. “There are many challenging tasks ahead,” says Tony Irwin, a nuclear technology expert at Australian National University and the University of Sydney. Workers must reduce radiation levels, plug leaks, and decontaminate water—all while the threat of aftershocks persists.

    In the early days, Tokyo Electric hoped to restart Fukushima's original cooling systems. But the company was forced to explore alternatives, says Hidehiko Nishiyama, deputy director of Japan's Nuclear and Industrial Safety Agency. The utility is now planning to build heat exchangers that will circulate fresh water through the reactors to cool the fuel. To seal off reactor buildings, engineers are planning to wrap them in polyester sheets stretched over steel frames.

    The biggest challenge, Nishiyama says, is protecting workers. Some have been entering the unit 1 building to prepare for construction. But so far, only robots have entered the unit 2 and 3 reactor buildings, where radiation levels top 50 millisieverts per hour. Tokyo Electric may expand the use of robots, which so far have been limited to taking radiation measurements and videos. Because integrated circuits can be affected by radiation, these probes must be primarily mechanical or have hardened electronics. Once new cooling systems and enclosures are in place, work could start on the semipermanent buildings needed for recovering nuclear fuel and decommissioning the reactors, a process that could take a decade or longer.

  6. Seismology

    New Work Reinforces Megaquake's Harsh Lessons in Geoscience

    1. Richard A. Kerr

    High-tech analyses of Japan's March earthquake overturn long-held views of fault behavior and warn that another disaster may be looming.

    The moment the Tohoku-Oki earthquake struck off northern Japan on 11 March, many researchers knew their expectations had been shattered. The great offshore fault could not be counted on to behave at all predictably. And using onshore observations to gauge whether an offshore fault is building toward failure has grave limitations.

    Now three papers (http://scim.ag/MSimons, http://scim.ag/S-Ide, and http://scim.ag/M-Sato) published online this week in Science help show why the inevitable release of seismic energy failed to play out as expected and why monitoring from afar fell short. The papers also point to a possible huge quake to the south, closer to Tokyo. Seismologists are concerned, says Mark Simons of the California Institute of Technology in Pasadena, but they are also now acutely aware of their limitations. “We have no idea what's going on” to the south, he says, but they're anxious to find out.

    A game of ring toss.

    March's huge quake (yellow contours) and past smaller quakes (colored loops) have left a patch of threatening fault (question mark).

    CREDIT: ADAPTED FROM MARK SIMONS ET AL., SCIENCE (2011)

    Many seismologists had thought that the offshore fault north of Tokyo was fairly simple and uniform. The ocean plate diving beneath Japan, the thinking went, should stick and slowly build up enough stress to rupture the fault. And the fault should fail segment by segment in large but not huge earthquakes. That's how the fault seemed to have behaved in recent centuries, with quakes of magnitude 7 to 8 or so popping off on any one segment every few decades or few centuries.

    But it turns out that the fault has more than one mode of operation. The three Science papers gauge where and by how much the fault slipped in the 11 March magnitude-9.0 quake. Simons and his colleagues combined seismic data recorded around the world, crustal movements on Japan recorded by GPS, and tsunami waves recorded at buoys at sea. Satoshi Ide of the University of Tokyo and colleagues compared the seismic signature of the magnitude 9 with that of its largest foreshock, a magnitude 7.3. And Mariko Sato of the Japan Coast Guard in Tokyo and colleagues actually measured the motion of the sea floor before and after subsea GPS observations. “You can't have a better recorded earthquake,” says David Wald of the U.S. Geological Survey in Golden, Colorado, who was not involved in any of the studies.

    The three studies and unpublished estimates by other groups suggest that during the quake, the descending ocean plate and the overlying plate carrying Japan slipped past each other by as much as 50 to 60 meters. “Those are enormous slips,” Wald says, running about two to three times the maximum slip reported for the magnitude-8.8 Maule, Chile, quake of last year. But the pattern of slip is equally striking. Five contiguous segments of the fault spanning more than 600 kilometers broke at once in the quake, rather than one or at most two, as scientists had assumed. But only two central segments experienced extreme slip, and that high slip was concentrated far offshore on the shallower part of the fault (within the figure's yellow contours). Historic quakes had broken short segments of the deeper part of the fault nearer land (the loops of various colors).

    Obviously, the fault is more complicated than most researchers had assumed. Simons and his colleagues suggest that some irregularity on the fault is to blame. Something—perhaps a seamount on the sinking plate—pinned the high-slip patch of fault in place for 500 or 1000 years, they argue, while patches around it failed repeatedly in smaller quakes. The apparent absence of quakes in the stuck patch led many seismologists to assume that the fault there could be slowly but steadily slipping without building up any strain. And their only means of monitoring the buildup of strain on the fault—GPS measurements of slow ground movement on land—was greatly handicapped by having the stuck patch 150 kilometers offshore. With such a limited perspective on the past release and the current buildup of strain, a magnitude-9 quake caught researchers by surprise.

    Learning that most of the March megaquake's slip was concentrated on two segments makes scientists more worried about other faults around the Pacific. “If you can get a 9 that is this compact,” Wald says, “it increases the number of places you can [fit in] a 9 where you may not have expected one.”

    All eyes are now on the southern portion of the length of fault that broke in the Tohoku quake. Neither historical quakes nor the Tohoku quake has broken the offshore, shallow portion of the fault there. And the Tohoku rupture transferred stress southward along the fault, abruptly increasing the stress there. As had been the case to the north, researchers can't say for sure whether that portion of the fault (marked by the question mark) has been freely slipping without generating quakes or locked and building strain toward a quake.

    If the offshore southern portion is indeed stuck, Simons and colleagues see “the possibility of a sibling to the 2011 event” that could be “similar to what just occurred offshore,” but half as far from Tokyo. So researchers are anxious to find out whether the stress transferred southward from the 9 has accelerated slow slip on the fault and thus defused the threat of a quake. If the fault isn't slipping, another quake would be in the works. Speed is of the essence: A magnitude-8.7 sibling quake followed the 2004 Indian Ocean megaquake by 3 months.

  7. Seismology

    Seismic Crystal Ball Proving Mostly Cloudy Around the World

    1. Richard A. Kerr

    Failing at quake prediction, seismologists tried making fuzzier forecasts, but Japan's megaquake is only the latest reminder of the method's shortcomings.

    Out of the blue.

    Students at California State University, Northridge, ponder the destruction wrought by a quake on an unrecognized fault.

    CREDIT: MARK J. TERRILL/AP IMAGES

    When a devastating megaquake rocked the region north of Tokyo in March, nobody saw such a huge quake coming. “Japanese scientists are among the world's best, and they have the best monitoring networks,” notes geophysicist Ross Stein of the U.S. Geological Survey (USGS) in Menlo Park, California. “It's hard to imagine others are going to do forecasting better. No one group is doing it in a way we'd all be satisfied with.”

    In China, New Zealand, and California as well, recent earthquakes have underscored scientists' problems forecasting the future. A surprisingly big quake arrives where smaller ones were expected, as in Japan; an unseen fault breaks far from obviously dangerous faults, as in New Zealand. And, most disconcerting, after more than 2 decades of official forecasting, geoscientists still don't know how much confidence to place in their own warnings. “We can't really tell good forecasts from the bad ones” until the surprises arrive, Stein says. But improvements are in the works.

    Simple beginnings

    The current focus of earthquake prognostication research represents a retreat from its ambitious aims of a few decades ago. In the 1960s and '70s, seismologists worked on prediction: specifying the precise time, place, and magnitude of a coming quake. To do that, scientists needed to identify reliable signals that a fault was about to fail: a distinctive flurry of small quakes, a whiff of radon gas oozing from the ground, some oddly perturbed wildlife. Unfortunately, no one has yet found a bona fide earthquake precursor. By the time the 2004 magnitude-6.0 Parkfield earthquake—the most closely monitored quake of all time—struck the central San Andreas fault without so much as a hint of a precursor (Science, 8 October 2004, p. 206), most researchers had abandoned attempts at precise prediction.

    Off the mark.

    A magnitude-9 quake (star and blue band) struck far north of the zone considered to have the greatest seismic hazard (red).

    CREDIT: ADAPTED FROM THE EARTHQUAKE RESEARCH COMMITTEE, NATIONAL SEISMIC HAZARD MAPS FOR JAPAN (2010), HEADQUARTERS FOR EARTHQUAKE RESEARCH PROMOTION

    Parkfield did mark an early success of a new strategy: quake forecasting. Rather than waiting for a warning sign, forecasters look to the past behavior of a fault to gauge future behavior. They assume that strain on a fault is building steadily and that the same segment of fault that broke in the past will produce a similar break again in the future, once it reaches the same breaking point. Instead of giving the year or range of years when the next quake will strike a particular segment of fault, they express it as a probability.

    USGS issued its first official earthquake forecast for the San Andreas in 1988 (Science, 22 July 1988, p. 413). Parkfield, which had a long record of similar quakes at roughly 22-year intervals, rated a 99% probability of repeating in the next 30 years. That turned out to be a success for the 1988 forecast. And the southern Santa Cruz Mountains segment, which last slipped in the 1906 San Francisco quake, was given a 30% chance of failing again within 30 years, which it did in 1989.
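
    A rough sketch of how such a segment forecast turns a recurrence history into a probability is given below, using a simple renewal model. The normal recurrence distribution, its 10-year spread, and the 22-years-elapsed figure are assumptions of this sketch (WGCEP's actual method differed); only the roughly 22-year mean recurrence and the 30-year window echo the Parkfield numbers above.

```python
import math

# Illustrative renewal-model forecast for a single fault segment. The normal
# recurrence distribution, its 10-year spread, and the 22-years-elapsed figure
# are assumptions for this sketch; only the ~22-year mean recurrence and the
# 30-year window echo the Parkfield numbers in the article.

def normal_cdf(x: float, mean: float, sd: float) -> float:
    """Cumulative probability of a normal distribution."""
    return 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))

def conditional_prob(elapsed: float, window: float, mean: float, sd: float) -> float:
    """P(rupture within `window` years | no rupture in the `elapsed` years so far)."""
    survived = 1.0 - normal_cdf(elapsed, mean, sd)
    ruptures = normal_cdf(elapsed + window, mean, sd) - normal_cdf(elapsed, mean, sd)
    return ruptures / survived

print(f"{conditional_prob(elapsed=22, window=30, mean=22, sd=10):.3f}")  # ~0.997
```

    The high conditional probability in this toy example reflects the fact that, by 1988, the Parkfield segment had already gone roughly one full average recurrence interval without rupturing.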

    Since then, the 1988 San Andreas forecast has had no more hits, but it has missed plenty of serious California seismic activity. The reason: lack of data. Quake forecasting “is highly conditioned on the information available,” says William Ellsworth, a seismologist at USGS's office in Menlo Park. Ellsworth served on USGS's Working Group on California Earthquake Probabilities (WGCEP) that issued the 1988 forecast. Given the sorely limited knowledge of California fault history, WGCEP limited itself to the best-known fault segments, mainly on the San Andreas fault. It ignored not only a plethora of unrelated faults but also most branches of the San Andreas that slice through populated regions such as the San Francisco Bay area.

    The limitations of that narrow focus became clear while the group was still reaching its conclusions. At its 23 November 1987 meeting, members decided that the particularly skimpy information on one fault, the Superstition Hills fault in far Southern California, prevented them from making a forecast. Within hours, a magnitude-6.5 quake struck that fault, followed 11 hours later by a magnitude-6.7 quake on a previously unknown, intersecting fault. There would be other surprises on little-known faults: the 1992 magnitude-7.3 Landers quake off the southern San Andreas (1 killed, $92 million in damage); the 1994 magnitude-6.7 Northridge earthquake on a previously unknown, buried fault (60 killed, $20 billion in damage); and the 1999 Hector Mine quake, magnitude 7.1, in the remoteness of the Mojave Desert.

    Modern prognostication

    But earthquake forecasting “has hardly stood still over 20-plus years,” Ellsworth notes. Scientists now have much more information about faults both major and minor; just as significant, they have developed a much keener sense of how to proceed when data are lacking. “To do seismic hazard right, we have to acknowledge we don't know how Earth works,” says seismologist Edward Field of USGS in Golden, Colorado, who is leading the next official forecast for California, due in 2012.

    In the past, Field notes, official forecasters would take their best guess at how a fault works—for example, what length of fault will break and how fast the fault slips over geologic time—and feed it into a single forecast model, which would spit out a probability that a particular quake would occur in the next 30 years. WGCEP followed that approach for its 1988 and 1995 forecasts.

    In the late 1990s, however, it became clear that a single model wasn't enough. “There's no agreement on how to build one model,” Field says, “so we have to build more than one to allow for the range of possible models.” Forecasters merged 480 different models for the 2007 California forecast to produce a single forecast with more clearly defined uncertainties. They aren't done yet. Current models cannot produce some sorts of quakes that have actually occurred, Field says, including the sorts that struck New Zealand and Japan.

    Those two quakes are “perfect examples of what we need to fix,” Field says. In New Zealand, last September's magnitude-7.1 Darfield quake struck a previously unknown fault that probably hadn't ruptured for the past 15,000 years. Scientists were aware that a quake of that size was possible in the region. New Zealand's official forecast estimated that “background” seismicity would give rise somewhere to one about once in 10,000 years. The New Zealand forecast “is doing its job,” says seismologist Mark Stirling of New Zealand's GNS Science in Lower Hutt—but in this case, the information was too vague to be of much use. (Once the quake had occurred, statistical forecasting based on the size of the main shock did anticipate the possibility of its largest aftershock: a magnitude-6.3 quake in February that heavily damaged older structures in Christchurch.)

    Mind the red.

    The southern San Andreas bears the highest probability (red) of a big California quake.

    CREDIT: ADAPTED FROM THE SOUTHERN CALIFORNIA EARTHQUAKE CENTER/NSF/USGS

    In the case of the Tohoku earthquake, the culprit was an “unknown unknown.” Japanese seismologists preparing official Japan forecasts have assumed that the offshore fault running the length of the main island north of Tokyo had largely revealed its true nature. “I thought we really understood the Tohoku area,” says seismologist James Mori of Kyoto University in Japan. “Five hundred years seemed to be a long enough quake history given the [large] number of earthquakes.”

    Drawing on a centuries-long history of quakes of magnitude 7 to 8 rupturing various parts of the fault, members of the official Earthquake Research Committee had divided the offshore fault into six segments, each roughly 150 kilometers long, that they expected to rupture again. They assigned each segment a probability of rupturing again in the next 30 years; the probabilities ranged from a few percent to 99% in recent official forecasts.

    Official forecasters had not included the possibility of more than two segments rupturing at once. They knew that two adjacent segments seemed to have broken together in 1793 to produce a magnitude-8.0 quake. And at their meeting in February, they also considered geologic evidence that a big tsunami in 869 C.E. had swept several kilometers inland across the same coastal plain inundated in March. But they concluded they had too little data to consider what sort of earthquake would result if more than two segments failed at once, says seismologist Kenji Satake of the University of Tokyo, who served on the committee. “I don't think anyone anticipated a 9.0,” Satake says.

    In the event, five fault segments failed in one magnitude-9.0 quake (see p. 911). (The disastrous 2008 Wenchuan quake in China also grew to an unimagined size by breaking multiple fault segments.) In Japan, “what happened was a very improbable, 1000-year event,” Ellsworth says. The most dangerous earthquakes tend to be rare, and their very rarity, unfortunately, makes them hard to study.

    In spite of the obstacles, improvements are on the way. Previous official forecasts for California considered each fault segment in isolation, and they ignored aftershocks such as those that caused so much damage in New Zealand. By contrast, Field says, the next forecast, due in 2012, will allow the possibility of ruptures breaking through onto adjacent segments, such as a rupture on a side fault breaking onto the San Andreas itself. And it will also accommodate aftershocks.

    Seismologists are also beginning to test their models against new earthquakes as they occur. Because larger quakes are rarer than smaller ones, their forecasts take longer to test. (Testing on California quakes for the past 5 years, for example, gets you only up to magnitude 5.) So it pays to cast as wide a net as possible. “The best strategy is testing around the globe,” Stein says. He is chair of the scientific board of the GEM Foundation, a nonprofit based in Pavia, Italy, which is developing a global earthquake model to make worldwide forecasts. By including the whole world, Stein says, the model should enable scientists to test forecasts of big, damaging earthquakes in a practical amount of time.

  8. History of Science

    The Alchemical Revolution

    1. Sara Reardon

    As cryptic manuscripts and centuries-old labware yield their secrets, scholars are coming to realize that medieval "chymists" were real scientists after all.

    Exploded myth?

    Revisionist historians hope to change alchemists' image as delusional buffoons.

    CREDIT: THE ALCHEMIST'S EXPERIMENT TAKES FIRE, 1687/HENDRICK HEERSCHOP, DUTCH/OIL ON CANVAS LAID DOWN ON BOARD/GIFT OF FISHER SCIENTIFIC INTERNATIONAL/COURTESY OF THE CHEMICAL HERITAGE FOUNDATION COLLECTIONS/PHOTOGRAPH BY GREGORY TOBIAS

    For possibly the first time in 2 millennia, a chemist has used an ancient formula to transmute silver into gold. The secret, a solution called “divine water,” was in an ancient Greco-Egyptian metalworking manuscript originally written on papyrus and preserved in a mummy wrapping. Following the recipe exactly (lime, sulfur, and the “urine of a youth” combined and heated “until the liquid looks like blood”), Lawrence Principe mixed chemicals under a fume hood, heated the solution over a Bunsen burner, dropped in a silver Canadian Maple Leaf coin, and watched, pleased, as the coin turned yellow.

    It wasn't real gold, of course. Principe, who holds dual Ph.D.s in organic chemistry and history of science, says the layer of gold-tinted oxidation on the coin's surface might be useful for making metal ornaments look more expensive. But if the metal's color could be changed, a 3rd century thinker might have surmised, then why not its other properties? Could any base metal be transmuted entirely into gold?

    Those were reasonable questions for the time, Principe believes. “Science doesn't progress ever forward in one grand, royal road,” he says. “It's a twisted, thorny labyrinth with multiple pathways.” Yet alchemy is certainly a thorn in the side of historians: an unwelcome reminder of science's foray into magic.

    Principe and a growing number of other science historians, however, hold that alchemists—“chymists” is their preferred, less-loaded term—were serious scientists who kept careful lab notes and followed the scientific method as well as any modern researcher. He tests that hypothesis by recreating their experiments in a sunny storage closet repurposed as a small lab at Johns Hopkins University in Baltimore, Maryland. If the alchemists saw what they claimed, he says, then it's high time for an “alchemical revolution” to restore them to scientific respectability.

    Garage alchemy.

    Working at home, William Newman replicated a chymist's possible glimpse of atomic theory.

    CREDIT: ISTOCKPHOTO; COURTESY OF BILL NEWMAN

    In the view of these advocates, alchemists have been unjustly ranked with witches and mountebank performers, when in fact they were educated men with limited tools for inquiring into the nature of the universe. The mystical stories that shroud their writings suggest that they were busy recording spiritual visions. But the truth is more complex: As concerned as modern patent applicants about having their secrets stolen, chymists often coded their protocols behind a tapestry of arcane metaphors, allegories, and drawings. Their royal patrons encouraged such obscurity, worrying that successful transmutation of metal into gold would devalue their currency. And too much clarity could prove fatal at a time when falsely claiming success at transmutation might be punishable by death.

    If the lives of the chymists weren't hard enough, in the late 17th century as the European Enlightenment took the stage, a rising class of scientific professionals began a deliberate campaign to smear the entire discipline. In a talk to Leiden University professors in 1718, the Dutch physician Herman Boerhaave apologized in advance about his topic. “I must talk about chemistry!” he said. “About chemistry! A subject disagreeable, vulgar, laborious, [and] far from the affairs of intelligent people.” Well-known scientists such as Isaac Newton and Robert Boyle dabbled in chymistry at their peril; their work was often hidden from other scholars or suppressed, only to resurface in the 20th century. While chemistry eventually regained its good name, alchemy remained a bête noire among historians of science for centuries. Until recently, peer-reviewed journals refused to publish papers on the topic.

    “The way alchemy was presented in the early 1980s was a parody, partially created in the 18th century and added to by people who didn't read the sources and tried to recraft the sources into their own ideas,” Principe says. “But what was the daily activity of an alchemist? When he got up each morning and went into his workshop, what was actually in the flasks that he held? What was he thinking about?” That's the question Principe says he's been working on for 30 years, “and I'm still trying to answer it. It's maybe a bit obsessive.”

    Obsession is what it takes: Even after cutting through all the symbolic coding, recreating experiments is difficult. “We talk about lots of these processes as though they were easy, while they actually involve a lot of tacit knowledge,” says William Newman, a historian at Indiana University, Bloomington, who also works on chymistry re-creations—some of them with a furnace in his own garage. Considering that even the best post-Renaissance experimenters distilled phosphorus from urine, melted silver from whatever coins they might be carrying, and used inexact heat sources, their results were difficult, if not impossible, for them to reproduce. “You have to back-engineer to understand how the theory integrates with the practice,” Newman says. “There's no better way to do that than to do the experiments themselves.”

    Principe's current pet project is to understand the glow of the Bologna stone: a legendary rock that, when placed in a fire, was one of the first recorded examples of natural phosphorescence. By chance, a 17th century cobbler who had put a piece of barite in his fire stumbled upon the perfect combination of factors to light it, but re-creating the process stumped centuries of chymists. Following clues from a manuscript by the 17th century natural philosopher (and alchemist) Wilhelm Homberg, Principe went to Italy and retrieved bits of barite from a field that is now a skeet-shooting range outside Bologna. In a replica he built of Homberg's oven, and measuring parameters such as temperature, pressure, and gas exchange, he's gotten them to glow faintly—just as described 300 years ago. “As I read Homberg's description, both of the stones and how you work with them, I never understood it at a really deep level until I had done it myself,” Principe says.

    Re-creating experiments, the historians say, helps describe how the mysteries that teased early chymists gave rise to modern science. For example, Newman traces atomic theory directly back to the early 17th century German chymist Daniel Sennert. The existence of atoms had been proposed on metaphysical grounds centuries earlier, but Sennert was the first to infer it experimentally. While researching transmutation, he found that silver could be re-isolated after being dissolved in nitric acid—evidence, Sennert concluded, that metals are made up of irreducible “corpuscles.”

    Relics of science.

    Flasks and beakers from 16th century labs in Austria's Oberstockstall monastery reveal what alchemists were brewing.

    CREDIT: COURTESY LAWRENCE PRINCIPE; MARCOS MARTINÓN-TORRES

    “A lot of scientific laws that were formulated as late as the 19th century were actually in play much earlier than we had imagined,” says archaeologist Marcos Martinón-Torres of University College London. “We easily dismiss things chymists did as superstitious, but when you look further into it, they have a lot more ingenuity than we credit them for.”

    Alchemist nouveau.

    Ancient manuscripts gave Lawrence Principe the recipe for “divine water” that can turn a silver coin into faux gold.

    CREDIT: K. D. KUNTZ

    For instance, as early as the 14th century, many alchemists believed that their experiments would work only if their crucibles were made in the Hesse region of Germany. While excavating labs in Austria, Martinón-Torres dug up broken shards of such crucibles and analyzed their chemical makeup using scanning electron microscopy, x-ray diffraction, and other imaging techniques. It turned out there was truth behind the lore: In a 2006 paper in Nature, Martinón-Torres revealed that Hessian potters in the 15th century knew how to make a highly heat-resistant ceramic component now called mullite. The secret, later lost and not rediscovered until the 1920s, enabled chymists to conduct technically demanding experiments.

    Martinón-Torres also analyzed traces of the chemicals the crucibles had held. For the most part, he says, the results agree with the chymists' notes. “They never discovered transmutation, but they discovered modern experimental science instead,” he says—and with it tangible byproducts such as cosmetics, pigments, and medicines.

    These new realizations have brought a swarm of students to the field and inspired a growing number of international conferences on alchemy. “I can't even keep up with the literature now,” Principe says. History is being rewritten as scholars unearth neglected manuscripts, outing a growing number of early scientists as closet alchemists.

    On one such foray, Principe recently struck a glimmer of gold. He found buried in a Russian military archive an unpublished chymistry textbook by Homberg, the official chemist of the French Royal Academy. The manuscript had been hidden since 1716; Principe spent 7 years tracking it down. In it, Homberg claimed to have discovered the philosopher's stone: the fabled element that would transmute base metals into gold.

    “He was an alchemist!” Principe says with glee. Homberg had covertly searched for the secret of transmutation with the Duc d'Orleans and described his successful method in his textbook, much to the chagrin of the nascent academy. “The academy is funded by the crown; they're publicly the most visible intellectuals in France. You don't want them dealing with something that has this bad reputation,” says Principe, who is currently writing a biography of Homberg. “That's probably why his final manuscript was never published.”

    So what happens if Principe manages to re-create Homberg's last protocol right down to the pièce de résistance? “If I do, you can visit me in my lavish villa somewhere and ask me about it,” he told an audience at the 2011 meeting of AAAS (publisher of Science). “I won't tell you about it, but I can offer you a glass of wine and we can talk about something else.”
