News this Week

Science  27 Jan 2012:
Vol. 335, Issue 6067, pp. 384
  1. Around the World

    1 - Cambodia, China, Indonesia, and Vietnam
    H5N1 Continues to Kill
    2 - Tokyo
    University Considers Shifting Academic Year
    3 - Tissint, Morocco
    Mars Scientists Ecstatic Over ‘Fresh’ Mars Rocks
    4 - United Kingdom and Israel
    Partnership With Israeli Firm Stirs U.K. Museum Protest
    5 - São Paulo, Brazil
    Space Chief Appointed New Science Minister
    6 - Bozeman, Montana
    Moon Orbiters Dubbed 'Ebb' and 'Flow'

    Cambodia, China, Indonesia, and Vietnam

    H5N1 Continues to Kill


    H5N1 deaths are linked to sick birds.


    Even as controversy over a lab-created strain of the H5N1 bird flu virus rages, outbreaks of the natural strain continue in poultry in much of south and southeast Asia—and the human death toll is mounting. Since the beginning of the year, the virus has killed individuals in Cambodia, China, Indonesia, and Vietnam, and there has been a case of human infection in Egypt, according to the World Health Organization (WHO) and local press accounts. WHO reports that most of the recent human cases can be traced to contact with sick birds.

    Since 2003, avian influenza has proved fatal in 343 of the 582 humans infected with H5N1. That high death rate portends a devastating pandemic if the virus evolves the ability to pass easily between humans. Last year, two labs created a strain of H5N1 that might do just that, spreading from ferret to ferret with ease (Science, 6 January, p. 20). The teams have responded to the international uproar over the dangers of the studies by agreeing to halt related research for 60 days to allow for an “international discussion” (see p. 387).


    Tokyo

    University Considers Shifting Academic Year

    Worried about being left behind in the globalization of higher education, the University of Tokyo is laying plans to shift the start of its school year from April to autumn—in 5 years or so.

    Japan's April-to-March academic year is out of sync with the majority of universities around the world and is “one big reason” the country lags both in attracting international students and in sending its own youth abroad, said Takao Shimizu, a university biochemist who heads a committee recommending the switch.

    The panel's preliminary report, released 20 January, notes that last year a mere 1.9% of undergrads at Todai, as the university is called in Japan, were international students. This compares with 10% at Harvard University and 7% at Stanford University.

    Shimizu's committee will gather comment from within and beyond the university before finalizing its recommendations by the end of March. Todai doesn't want to go it alone, however. Officials there intend to launch discussions with 11 other top Japanese universities interested in the idea.

    Tissint, Morocco

    Mars Scientists Ecstatic Over ‘Fresh’ Mars Rocks

    New arrival.

    Freshly fallen Mars meteorites are enticing scientists.


    With samples hand-carried from Mars still decades away, researchers are rushing to look for signs of life in the latest arrivals from the Red Planet, the Tissint meteorites seen blazing through the atmosphere over Morocco last July. “I'm throwing everything I have at” the meteorites, says microbiologist Andrew Steele of the Geophysical Laboratory of the Carnegie Institution for Science in Washington, D.C.

    Just last week, a committee of meteorite experts associated with the Meteoritical Society tentatively confirmed a martian origin for the meteorites. Meteorite dealers, who retrieved 7 kilograms of meteorites from the Moroccan desert in December, immediately shared samples with scientists in order to verify the meteorites' origin. Now that they've been verified as martian, slivers of these rocks are selling at tens of thousands of dollars an ounce.

    After the excitement in the late 1990s over purported signs of ancient life in the Mars meteorite ALH84001, Steele reported finding remnants of microorganisms in that famous rock. But he also showed that those remnants were contamination picked up during the rock's 10,000 years on Earth. With only 6 months spent in the desert, the Moroccan Mars meteorites look far more promising.

    United Kingdom and Israel

    Partnership With Israeli Firm Stirs U.K. Museum Protest

    The Natural History Museum in London is under fire for its scientific cooperation with an Israeli cosmetics company, Ahava/Dead Sea Laboratories, which conducts research in the occupied West Bank. On 17 January, The Independent newspaper published a letter from 21 prominent intellectuals, including scientists, musicians, authors, and film directors, calling on the museum to end its E.U.-funded collaboration with the firm on a nanoparticle research study.

    Though officially headquartered south of Tel Aviv, Ahava's laboratories and factory are located in Mitzpe Shalem, a Jewish settlement on the banks of the Dead Sea, about 10 kilometers north of the recognized border between Israel and the Palestinian Territories. The United Nations and the International Court of Justice have said that Israel's settlements in the occupied West Bank are illegal. Israel, however, has disputed that characterization.

    Although the project will end in November, the letter writers say it's an issue still worth raising as the European Union reconsiders its criteria for future funding. “The main reason for doing it was to warn other people” who might be considering working with the company, says Patrick Bateson, president of the Zoological Society of London, who signed the letter.

    São Paulo, Brazil

    Space Chief Appointed New Science Minister

    Marco Antônio Raupp, currently president of the Brazilian Space Agency, will become his nation's new minister of science, technology, and innovation on 24 January. Raupp, 73, has more than 40 years of experience in research, a sharp contrast to his predecessor Aloizio Mercadante, an economist and leading figure in the governing Workers' Party who never received wide support from Brazil's scientific community.

    A native of the relatively developed south of the country, Raupp graduated in physics from the Federal University of Rio Grande do Sul and got his Ph.D. in mathematics from the University of Chicago. He has taught at universities in Brazil's three most important cities: São Paulo, Rio de Janeiro, and Brasília.

    Raupp said he was honored by his selection by Brazil's president and called it “a crucial moment” in the ministry's development. He said in a statement: “I am totally aware of the unprecedented demands being made on science, technology and innovation to contribute as an essential part of Brazil's social and economic development.”

    Bozeman, Montana

    Moon Orbiters Dubbed 'Ebb' and 'Flow'

    Smash a (retroactive) champagne bottle on them. On 17 January, NASA's twin spacecraft—formerly dubbed Gravity Recovery and Interior Laboratory, or GRAIL A and B—were officially christened Ebb and Flow.

    The monikers come thanks to Nina DiMauro's fourth-grade class at Emily Dickinson Elementary School in Bozeman, Montana, which wrote the winning essay in a contest to name the orbiters. If Ebb and Flow bring to mind the lunar pull on the tides, that's the point: DiMauro's pupils noted that they wanted to evoke the moon's most noticeable effect on Earth. “GRAIL-A and GRAIL-B are on a journey just like the moon is on a journey around the Earth,” they wrote.

    The Bozeman students beat out nearly 900 other classrooms across the country to name the probes, which launched last September and will begin their scientific mission in March. Ebb and Flow will keep an eye on the moon's gravity, mapping subtle changes across ridges and valleys.

  2. Random Sample

    Lost and Found After 165 Years


    A collection of “lost” fossils, some collected by Charles Darwin during his famous voyage on the HMS Beagle, went on display online at the British Geological Survey on 17 January. Last year, Howard Falcon-Lang, a researcher at Royal Holloway, University of London, stumbled on the collection of 314 glass slides gathering dust in a wooden cabinet in a corner of the Geological Survey. Botanist Joseph Hooker, a close friend of Darwin's, originally assembled the collection in 1846 for the British Geological Survey. However, by the time the survey instituted a formal specimen register in 1848, Hooker had left to explore the Himalayas and was unavailable to help catalog his work. The slides contain micrometer-thin slices of fossilized wood and plants, early examples of thin sections that reveal the internal structures of monkey-puzzle tree cones (shown), petrified wood, and giant club mosses.

    A New Way for Journals To Peer Review Papers?

    Carsten Rahbek, editor-in-chief of Ecography, has a problem facing many journal editors. “Due to an explosion in the number of submitted papers, we have major problems finding people to review, and the quality has gone down as well,” he says.

    So Rahbek's journal, published by Wiley, has joined Peerage of Science, an online social network that aims to provide journals with already-peer-reviewed manuscripts. Recently founded by three Finnish ecologists, the Web site accepts paper submissions and matches them with potential reviewers. Scientists with a potentially publishable paper can upload it to the site; other members with relevant expertise, alerted by keywords in the paper, can then provide reviews that scientific journals can use to decide whether to offer to publish the work.


    Janne-Tuomas Seppänen, a postdoc at the University of Jyväskylä, came up with the idea for Peerage of Science in 2010. Scientists receive one credit for every review they finish, whereas uploading a manuscript costs two credits divided by the number of authors. The author who uploads the paper must have a positive balance. “This formalises an unwritten rule: He who wants his manuscripts reviewed, reviews other manuscripts in return,” says Seppänen. Since November, four manuscripts have received reviews.
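
    In code, the credit rule Seppänen describes amounts to a small piece of bookkeeping. The sketch below only illustrates the rule as stated here; the class and method names are invented and are not Peerage of Science's actual implementation.

        from collections import defaultdict

        class CreditLedger:
            """Toy ledger for the review-credit rule described above."""

            def __init__(self):
                self.balance = defaultdict(float)

            def record_review(self, reviewer):
                # Every finished review earns the reviewer one credit.
                self.balance[reviewer] += 1.0

            def upload_manuscript(self, uploader, n_authors):
                # Uploading costs two credits divided by the number of authors,
                # and the uploading author must hold a positive balance.
                if self.balance[uploader] <= 0:
                    raise ValueError("uploading author needs a positive credit balance")
                self.balance[uploader] -= 2.0 / n_authors

        ledger = CreditLedger()
        ledger.record_review("author_a")                    # balance: +1 credit
        ledger.upload_manuscript("author_a", n_authors=2)   # cost: 2/2 = 1 credit

    Under this rule, a sole author pays two full credits per upload, while each additional co-author lowers the uploader's cost.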

    Peerage of Science is currently free for journals, but its founders plan to charge fees that they say will be below what it costs a journal to handle its own peer review. Rahbek has so far encountered one article of interest to Ecography—which he decided not to publish. “But,” he says, “I liked the procedure and the quality of the reviews.”

    By the Numbers

    18% — Percentage of reviewed research grant proposals that were funded by the National Institutes of Health in 2011, a record low.

    5.7 million to 6.7 million — Number of bats that have died from white nose syndrome since the disease was first documented in 2006, according to a new estimate by U.S. Fish and Wildlife Service biologists.

  3. Newsmakers

    Crafoord Prizes Announced

    Clockwise from top left: Bourgain, Tao, Ghez, and Genzel.


    The Royal Swedish Academy of Sciences announced 19 January the four winners of the 2012 Crafoord Prize in the fields of mathematics and astronomy.

    The two awardees in mathematics were Terence Tao of the University of California, Los Angeles (UCLA) and Jean Bourgain of the Institute for Advanced Study (IAS) in Princeton. Reinhard Genzel of the Max Planck Institute for Extraterrestrial Physics in Garching, Germany, and Andrea Ghez of UCLA will claim the prize for astronomy.

    In 2004, Tao, together with Ben Green, proved the Green-Tao theorem, which states that the prime numbers contain arithmetic progressions of any finite length: runs of primes in which each term exceeds the previous one by the same fixed amount. For instance, the sequence 5, 17, 29, 41 is four primes spaced 12 apart. Bourgain has proved several groundbreaking results on “well-posedness” of nonlinear differential equations, such as the Schrödinger equation of quantum mechanics and the Korteweg–de Vries equation of wave propagation.
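
    For readers who want the precise statement, a standard formulation of the theorem (paraphrased here, not taken from the prize citation) is:

        % Green--Tao theorem, standard formulation
        \textbf{Theorem (Green--Tao).} For every integer $k \ge 1$ there exist a prime $p$
        and an integer $d \ge 1$ such that
        \[
          p,\; p + d,\; p + 2d,\; \dots,\; p + (k-1)d
        \]
        are all prime; that is, the primes contain arithmetic progressions of arbitrary
        finite length. The sequence $5, 17, 29, 41$ is the case $k = 4$, $p = 5$, $d = 12$.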

    Genzel and Ghez are arguably the Coke and Pepsi of researchers studying supermassive black holes. Their groups independently produced compelling evidence that such a feature, 4 million times more massive than the sun, sits at the center of the Milky Way.

    Smithsonian Director to Take Over Wildlife Conservation Society


    Cristián Samper, director of the Smithsonian National Museum of Natural History, is shifting gears in August to become the president and CEO of the Wildlife Conservation Society (WCS), a $200 million conservation organization that runs the Bronx Zoo, other New York City zoos, and wildlife programs in 65 countries.

    Samper started his career in conservation and says he is eager to get back to that focus. The natural history museum director since 2003, Samper helped transform the museum, bringing in 25 new curators, starting an endowed fellowship program, and redoing two-thirds of the exhibits, including the mammals, ocean, and human origins halls. In 2007, he took over temporarily for Smithsonian Institution Secretary Lawrence Small, who resigned amidst controversy. Samper again became the natural history museum's director after Wayne Clough was hired as the institution's secretary 15 months later.

    Samper leaves the museum with a “major legacy,” says Clough. “It's a great gain for WCS,” says Jonathan Coddington, the museum's associate director for research and collections.

  4. Computer Science

    What It'll Take to Go Exascale

    Robert F. Service

    Scientists hope the next generation of supercomputers will carry out a million trillion operations per second. But first they must change the way the machines are built and run.

    On fire.

    More powerful supercomputers now in the design stage should make modeling turbulent gas flames more accurate and revolutionize engine designs.


    Using real climate data, scientists at Lawrence Berkeley National Laboratory (LBNL) in California recently ran a simulation on one of the world's most powerful supercomputers that replicated the number of tropical storms and hurricanes that had occurred over the past 30 years. Its accuracy was a landmark for computer modeling of global climate. But Michael Wehner and his LBNL colleagues have their eyes on a much bigger prize: understanding whether an increase in cloud cover from rising temperatures would retard climate change by reflecting more light back into space, or accelerate it by trapping additional heat close to Earth.

    To succeed, Wehner must be able to model individual cloud systems on a global scale. To do that, he will need supercomputers more powerful than any yet designed. These so-called exascale computers would be capable of carrying out 10¹⁸ floating point operations per second, or an exaflop. That's nearly 100 times more powerful than today's biggest supercomputer, Japan's “K Computer,” which achieves 11.3 petaflops (10¹⁵ flops) (see graph), and 1000 times faster than the Hopper supercomputer used by Wehner and his colleagues. The United States now appears poised to reach for the exascale, as do China, Japan, Russia, India, and the European Union.

    It won't be easy. Advances in supercomputers have come at a steady pace over the past 20 years, enabled by the continual improvement in computer chip manufacturing. But this evolutionary approach won't cut it in getting to the exascale. Instead, computer scientists must first figure out ways to make future machines far more energy efficient and tolerant of errors, and find novel ways to program them.

    “The step we are about to take to exascale computing will be very, very difficult,” says Robert Rosner, a physicist at the University of Chicago in Illinois, who chaired a recent Department of Energy (DOE) committee charged with exploring whether exascale computers would be achievable. Charles Shank, a former director of LBNL who recently headed a separate panel collecting widespread views on what it would take to build an exascale machine, agrees. “Nobody said it would be impossible,” Shank says. “But there are significant unknowns.”

    Gaining support

    The next generation of powerful supercomputers will be used to design high-efficiency engines tailored to burn biofuels, reveal the causes of supernova explosions, track the atomic workings of catalysts in real time, and study how persistent radiation damage might affect the metal casing surrounding nuclear weapons. “It's a technology that has become critically important for many scientific disciplines,” says Horst Simon, LBNL's deputy director.

    That versatility has made supercomputing an easy sell to politicians. The massive 2012 spending bill approved last month by Congress contained $1.06 billion for DOE's program in advanced computing, which includes a down payment to bring online the world's first exascale computer. Congress didn't specify exactly how much money should be spent on the exascale initiative, for which DOE had requested $126 million. But it asked for a detailed plan, due next month, with multiyear budget breakdowns listing who is expected to do what, when. Those familiar with the ways of Washington say that the request reflects an unusual bipartisan consensus on the importance of the initiative.

    “In today's political atmosphere, this is very unusual,” says Jack Dongarra, a computer scientist at the University of Tennessee, Knoxville, who closely follows national and international high-performance computing trends. “It shows how critical it really is and the threat perceived of the U.S. losing its dominance in the field.” The threat is real: Japan and China have built and operate the two most powerful supercomputers in the world.

    The rest of the world also hopes that their efforts will make them less dependent on U.S. technology. Of today's top 500 supercomputers, the vast majority were built using processors from Intel, Advanced Micro Devices (AMD), and NVIDIA, all U.S.-based companies. But that's beginning to change, at least at the top. Japan's K machine is built using specially designed processors from Fujitsu, a Japanese company. China, which had no supercomputers in the Top500 List in 2000, now has five petascale machines and is building another with processors made by a Chinese company. And an E.U. research effort plans to use ARM processing chips made by a U.K. company.

    Getting over the bumps

    Although bigger and faster, supercomputers aren't fundamentally different from our desktops and laptops, all of which rely on the same sorts of specialized components. Computer processors serve as the brains that carry out logical functions, such as adding two numbers together or sending a bit of data to a location where it is needed. Memory chips, by contrast, hold data for safekeeping for later use. A network of wires connects processors and memory and allows data to flow where and when they are needed.

    For decades, the primary way of improving computers was creating chips with ever smaller and faster circuitry. This increased the processor's frequency, allowing it to churn through tasks at a faster clip. Through the 1990s, chipmakers steadily boosted the frequency of chips. But the improvements came at a price: The power demanded by a processor is proportional to its frequency cubed. So doubling a processor's frequency requires an eightfold increase in power.

    New king.

    Japan has the fastest machine (bar), although the United States still has the most petascale computers (number in parentheses).

    On the rise.

    The gap in available supercomputing capacity between the United States and the rest of the world has narrowed, with China gaining the most ground.


    With the rise of mobile computing, chipmakers couldn't raise power demands beyond what batteries could store. So about 10 years ago, chip manufacturers began placing multiple processing “cores” side by side on single chips. This arrangement meant that only twice the power was needed to double a chip's performance.
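
    The arithmetic behind those two scaling rules is easy to check. The short sketch below uses only the relationships quoted above (power rising with the cube of frequency versus roughly linearly with core count); the wattage figure is illustrative, not a measured value.

        def power_if_frequency_scaled(base_power_w, speedup):
            # Power demanded by a processor is proportional to its frequency cubed,
            # so doubling the clock costs 2**3 = 8 times the power.
            return base_power_w * speedup ** 3

        def power_if_cores_added(base_power_w, speedup):
            # Adding identical cores scales power roughly linearly with throughput,
            # so doubling performance costs only twice the power.
            return base_power_w * speedup

        base = 100.0  # watts for a hypothetical single-core chip
        print(power_if_frequency_scaled(base, 2))  # 800.0 W, an eightfold increase
        print(power_if_cores_added(base, 2))       # 200.0 W, only a doubling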

    This trend swept through the world of supercomputers. Those with single souped-up processors gave way to today's “parallel” machines that couple vast numbers of off-the-shelf commercial processors together. This move to parallel computing “was a huge, disruptive change,” says Robert Lucas, an electrical engineer at the University of Southern California's Information Sciences Institute in Los Angeles.

    Hardware makers and software designers had to learn how to split problems apart, send individual pieces to different processors, synchronize the results, and synthesize the final ensemble. Today's top machine—Japan's “K Computer”—has 705,000 cores. If the trend continues, an exascale computer would have between 100 million and 1 billion processors.

    But simply scaling up today's models won't work. “Business as usual will not get us to the exascale,” Simon says. “These computers are becoming so complicated that a number of issues have come up that were not there before,” Rosner agrees.

    The biggest issue relates to a supercomputer's overall power use. The largest supercomputers today use about 10 megawatts (MW) of power, enough to power 10,000 homes. If the current trend of power use continues, an exascale supercomputer would require 200 MW. “It would take a nuclear power reactor to run it,” Shank says.

    Even if that much power were available, the cost would be prohibitive. At $1 million per megawatt per year, the electricity to run an exascale machine would cost $200 million annually. “That's a non-starter,” Shank says. So the current target is a machine that draws 20 MW at most. Even that goal will require a 300-fold improvement in flops per watt over today's technology.
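
    Those figures follow directly from the numbers quoted above, as the quick calculation below shows (all values are the article's, not independent estimates).

        # Electricity bill for a business-as-usual exascale machine.
        cost_per_mw_year = 1_000_000     # dollars per megawatt per year
        projected_draw_mw = 200          # megawatts without efficiency gains
        print(cost_per_mw_year * projected_draw_mw)   # 200,000,000 dollars per year

        # Efficiency implied by the 20 MW target.
        exaflop = 1e18                   # floating point operations per second
        target_power_w = 20e6            # 20 megawatts, in watts
        print(exaflop / target_power_w)  # 5e10, i.e., 50 gigaflops per watt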

    Ideas for getting to these low-power chips are already circulating. One would make use of different types of specialized cores. Today's top-of-the-line supercomputers already combine conventional processor chips, known as CPUs, with an alternative version called graphical processing units (GPUs), which are very fast at certain types of calculations. Chip manufacturers are now looking at going from “multicore” chips with four or eight cores to “many-core” chips, each containing potentially hundreds of CPU and GPU cores, allowing them to assign different calculations to specialized processors. That change is expected to make the overall chips more energy efficient. Intel, AMD, and other chip manufacturers have already announced plans to make hybrid many-core chips.

    Another stumbling block is memory. As the number of processors in a supercomputer skyrockets, so, too, does the need to add memory to feed bits of data to the processors. Yet, over the next few years, memory manufacturers are not projected to increase the storage density of their chips fast enough to keep up with the performance gains of processors. Supercomputer makers can get around this by adding additional memory modules. But that's threatening to drive costs too high, Simon says.

    Even if researchers could afford to add more memory modules, that still won't solve matters. Moving ever-growing streams of data back and forth to processors is already creating a bottleneck that can dramatically slow a computer's performance. Today's supercomputers use 70% of their power to move bits of data around from one place to another.

    One potential solution would stack memory chips on top of one another and run communication and power lines vertically through the stack. This more-compact architecture would require fewer steps to route data. Another approach would stack memory chips atop processors to minimize the distance bits need to travel.

    A third issue is errors. Modern processors compute with stunning accuracy, but they aren't perfect. The average processor will produce one error per year, as a thermal fluctuation or a random electrical spike flips a bit of data from one value to another.

    Such errors are relatively easy to ferret out when the number of processors is low. But it gets much harder when 100 million to 1 billion processors are involved. And increasing complexity produces additional software errors as well. One possible solution is to have the supercomputer crunch the same problem multiple times and “vote” for the most common solution. But that creates a new problem. “How can I do this without wasting double or triple the resources?” Lucas asks. “Solving this problem will probably require new circuit designs and algorithms.”
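
    The voting idea can be sketched in a few lines: run the same computation on several processors and accept the answer the majority agrees on. The sketch below is classic majority voting, shown as an illustration rather than any DOE design, and the catch Lucas points to is visible immediately: this brute-force version does the work three times.

        import random
        from collections import Counter

        def run_with_voting(compute, replicas=3):
            # Run the same computation on several (here, simulated) processors
            # and return the result the majority agrees on.
            results = [compute() for _ in range(replicas)]
            winner, votes = Counter(results).most_common(1)[0]
            if votes <= replicas // 2:
                raise RuntimeError("no majority: too many divergent results")
            return winner

        def flaky_add():
            # A computation whose result is occasionally corrupted by a bit flip.
            value = 2 + 2
            return value ^ 1 if random.random() < 0.2 else value

        print(run_with_voting(flaky_add))  # usually 4; voting masks rare flips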

    Finally, there is the challenge of redesigning the software applications themselves, such as a novel climate model or a simulation of a chemical reaction. “Even if we can produce a machine with 1 billion processors, it's not clear that we can write software to use it efficiently,” Lucas says. Current parallel computing machines use a strategy, known as the message passing interface (MPI), that divides computational problems and parcels out the pieces to individual processors, then collects the results. But coordinating all this traffic for millions of processors is becoming a programming nightmare. “There's a huge concern that the programming paradigm will have to change,” Rosner says.
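
    The divide-compute-collect pattern itself is straightforward; the sketch below uses the mpi4py bindings to MPI to scatter a toy workload and gather the partial results (the workload and chunking are illustrative, and an MPI installation is assumed). The programming headache researchers describe comes from coordinating this across millions of processes, not from the pattern itself.

        # Run with, e.g., `mpiexec -n 4 python sketch.py` (assumes MPI + mpi4py).
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        size = comm.Get_size()

        if rank == 0:
            data = list(range(1_000))
            # Divide the problem into one piece per process.
            chunks = [data[i::size] for i in range(size)]
        else:
            chunks = None

        my_chunk = comm.scatter(chunks, root=0)   # parcel the pieces out
        partial = sum(my_chunk)                   # local computation
        partials = comm.gather(partial, root=0)   # collect the results

        if rank == 0:
            print("total:", sum(partials))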

    DOE has already begun laying the groundwork to tackle these and other challenges. Last year it began funding three “co-design” centers, multi-institution cooperatives led by researchers at Los Alamos, Argonne, and Sandia national laboratories. The centers bring together scientific users who write the software code and hardware makers to design complex software and computer architectures that work in the fastest and most energy-efficient manner. It poses a potential clash between scientists who favor openness and hardware companies that normally keep their activities secret for proprietary reasons. “But it's a worthy goal,” agrees Wilfred Pinfold, Intel's director of extreme-scale programming in Hillsboro, Oregon.

    Not so fast.

    Researchers have some ideas on how to overcome barriers to building exascale machines.

    Coming up with the cash

    Solving these challenges will take money, and lots of it. Two years ago, Simon says, DOE officials estimated that creating an exascale computer would cost $3 billion to $4 billion over 10 years. That amount would pay for one exascale computer for classified defense work, one for nonclassified work, and two 100-petaflops machines to work out some of the technology along the way.

    Those projections assumed that Congress would deliver a promised 10-year doubling of the budget of DOE's Office of Science. But those assumptions are “out of the window,” Simon says, replaced by the more likely scenario of budget cuts as Congress tries to reduce overall federal spending.

    Given that bleak fiscal picture, DOE officials must decide how aggressively they want to pursue an exascale computer. “What's the right balance of being aggressive to maintain a leadership position and having the plan sent back to the drawing board by [the Office of Management and Budget]?” Simon asks. “I'm curious to see.” DOE's strategic plan, due out next month, should provide some answers.

    The rest of the world faces a similar juggling act. China, Japan, the European Union, Russia, and India all have given indications that they hope to build an exascale computer within the next decade. Although none has released detailed plans, each will need to find the necessary resources despite these tight fiscal times.

    The victor will reap more than scientific glory. Companies use 57% of the computing time on the machines on the Top500 List, looking to speed product design and gain other competitive advantages, Dongarra says. So government officials see exascale computing as giving their industries a leg up. That's particularly true for chip companies that plan to use exascale designs to improve future commodity electronics. “It will have dividends all the way down to the laptop,” says Peter Beckman, who directs the Exascale Technology and Computing Initiative at Argonne National Laboratory in Illinois.

    The race to provide the hardware needed for exascale computing “will be extremely competitive,” Beckman predicts, and developing software and networking technology will be equally important, according to Dongarra. Even so, many observers think that the U.S. track record and the current alignment of its political and scientific forces make it America's race to lose.

    Whatever happens, U.S. scientists are unlikely to be blindsided. The task of building the world's first exascale computer is so complex, Simon says, that it will be nearly impossible for a potential winner to hide in the shadows and come out of nowhere to claim the prize.

  5. Geoscience

    Ferreting Out the Hidden Cracks in the Heart of a Continent

    Naomi Lubick*

    Deep beneath the central United States, researchers find signs of buried faults that have triggered earthquakes in the past—and may still be kicking.

    Two hundred years ago, large earthquakes shook the central United States, near what was then the tiny town of New Madrid, Missouri. The shaking on 16 December 1811 was felt all the way to the East Coast. Two more earthquakes followed in January and February. Magnitude estimates for the largest quake have ranged from 6.8 to as high as 8.

    Connect the dots.

    Buried faults in the U.S. interior extend far outside the New Madrid Seismic Zone (oval patch at top).


    Today, the so-called New Madrid seismic zone continues to crackle with tiny earthquakes. But now, 4 million people live in the region, heightening the importance of answering the question of what could have triggered the recorded quakes—and whether and where big ones are likely to strike again. Looking beyond the faults thought to be responsible for the 1811–12 events, seismologists and geologists are discovering evidence for faults outside the active zone that may be less predictable than anyone suspected.

    A network of ancient faults could be influencing the shaking in the central United States, says M. Beatrice Magnani, a geophysicist at the Center for Earthquake Research and Information (CERI) at the University of Memphis in Tennessee. Last month at the American Geophysical Union meeting in San Francisco, Magnani reported new data showing how buried faults both inside and outside the New Madrid Seismic Zone have warped the continental crust there over millions of years. That means “the seismogenic zone might be larger than illuminated by modern seismicity,” she says—and that unrecorded earthquakes may have rocked now-quiet regions in the interior.

    The hidden faults formed hundreds of millions of years ago, when the tectonic plate under the then–Atlantic Ocean slid beneath eastern North America—much as plates are doing today in Japan, Chile, Haiti, and elsewhere. The plate reversed direction, leaving parts of Europe and Africa stuck to North America and raising the mountains whose eroded stumps are now the Appalachians. Today the ancient faults of the “failed rift” remain, buried tens of kilometers beneath the American Midwest and Eastern Seaboard. Nobody knows exactly where those faults are or how they connect and interact.

    To image the faults in the region around New Madrid, Magnani and her colleagues have taken ocean-observing techniques to the Mississippi River—“seismology à la Mark Twain,” she calls it. In research cruises in 2008, 2010, and 2011, the team sent sonic waves through the Mississippi River and the kilometer-deep lens of sediments it has deposited. The sound waves reflected off different layers to create seismic profiles that reveal where the layers have been warped or squished by earthquakes and other tectonic forces.

    The team found faults with “indistinguishable” profiles both within and outside the New Madrid Seismic Zone, Magnani says. In some places, the warping increases in the older rocks, which are millions of years old. Some faults also deform the young sediments above—evidence of more recent earthquakes and fault movements in the past 10,000 years or so. Higher resolution profiles, only tens of meters deep, show gently folded layers on one fault, hundreds of kilometers south of New Madrid. The folds, Magnani said at the meeting, most likely formed when sediment particles shook like liquid—a phenomenon called liquefaction—during an estimated magnitude-7.6 quake of unknown date.

    These profiles suggest that a 500-million-year-old plate boundary south of New Madrid remains active, Magnani says. Other evidence—gathered by Martitia Tuttle, an independent consulting geologist, and others who have dated liquefaction features—suggests that another fault of the same age, which crosses from Alabama into Oklahoma, has had at least three major seismically active periods in the past 10,000 years. But Magnani says her team's new data show that the nearby Ouachita fault remained quiet for reasons still unclear.

    In the New Madrid Seismic Zone, geoscientist Seth Stein of Northwestern University in Evanston, Illinois, and colleagues have argued that faults in inner continental settings can turn off and on as built-up stress shifts from one fault to another, lowering the risk of earthquakes in one place but raising it in another. As a result, Stein thinks, the risk of more big quakes in the New Madrid region is much lower than researchers have assumed. Magnani says her team's findings confirm some of Stein's ideas. “But that doesn't necessarily mean we should lower the seismic hazard in the central United States—actually, quite the opposite,” Magnani says: The discovery of unseen faults outside of New Madrid's obvious seismic zone may mean more danger, not less.

    Elsewhere in the central and eastern continent, modern earthquakes have revealed seismic hot spots in places such as Charleston, South Carolina; central Indiana; and Virginia, where a magnitude-5.8 quake struck in August 2011. But in quiet areas in between, ancient faults may still lurk undiscovered. The earthquake catalog is incomplete, says Susan Hough, a seismologist at the U.S. Geological Survey in Pasadena, California, who has estimated magnitudes of the historic New Madrid cycle using shaking-intensity reports.

    At the meeting, Hough said findings by Magnani's team make sense if stress released along old faults caused deformation or earthquakes in previously unexpected places. “Let others pop off elsewhere, and it all begins to hang together,” Hough says.

    * Naomi Lubick is a freelance writer based in Stockholm.

  6. Behavioral Sciences

    Modernizing an Academic Monastery

    Greg Miller

    The venerable Center for Advanced Study in the Behavioral Sciences has been trying to reinvent itself by applying behavioral science to 21st century problems.

    Intellectual haven.

    The Center for Advanced Study in the Behavioral Sciences at Stanford has provided a quiet environment for scholars to think and write since 1954. The center's library (center) houses hundreds of books written by past fellows. Stephen Kosslyn, the center's new director, wants the center to engage more with real-world problems.


    Thomas Kuhn wrote much of his landmark 1962 book, The Structure of Scientific Revolutions, at a secluded retreat in the foothills above Palo Alto, California. In the 1970s, future Nobelist Daniel Kahneman spent time here in the formative days of the field that came to be known as behavioral economics. For decades after the Ford Foundation started it in 1954, behavioral and social scientists coveted an invitation to the exclusive Center for Advanced Study in the Behavioral Sciences. The center's yearlong fellowships offered leading scholars freedom from teaching and other academic obligations, as well as a quiet place to reflect and write. Those who came produced seminal works in fields as diverse as political science and primatology.

    But in recent years, the center seems to many observers to have lost some of its luster, attracting fewer big-name scholars and producing fewer high-impact works. In part because of financial issues, nearby Stanford University took over the once-independent center in 2008. “The decision the trustees faced is do they just want to let it limp along, dissolve it, or try something new,” says Stephen Kosslyn, the center's current director.

    They opted for something new and recruited Kosslyn to shake things up. A distinguished psychologist, he left his post as dean of social sciences at Harvard University to become the director of the center in January 2011. “The culture here was that of a monastery,” Kosslyn says. He wants to open the center up more to the outside world and emphasize real-world problem solving over heady academic ruminations. He has proposed, among other things, setting up networks of researchers to examine specific issues, from how to make technology more accessible to elderly people to documenting psychological obstacles to peace in the Middle East. “We want to make behavioral science relevant,” he says.

    Food for thought

    Kosslyn is soft-spoken and unassuming, and he looks every bit the professor in round wire-rimmed glasses and a tweed jacket. At the same time, he's proud to say he saw the Grateful Dead perform on campus when he was a Stanford undergraduate in the 1970s, and he keeps an electric bass in his office. He plays whenever he can find a group of musicians with similar skills and tastes (mainly classic rock). He and his wife chose to live in San Francisco, more than 50 kilometers away from the center, because they were turned off by what Kosslyn describes as the materialism and anti-intellectualism of Silicon Valley, whose office parks and suburbs sprawl out below the center's hillside perch. “I don't like the culture,” he says.

    Several of his colleagues say Kosslyn is well-suited to revive the center. “He's a person of tremendous energy,” says Jonathan Cole, a sociologist at Columbia University and chair of the center's board of directors. “He's willing to take risks and experiment,” Cole says. Harvard psychologist Steven Pinker credits Kosslyn, his former graduate adviser, with breathing new life into another venerable institution. “The Harvard University psychology department had been a backwater, coasting on its reputation, before Steve reinvigorated it in the 1990s with an aggressive program of hiring young mid-career scientists, which vaulted the department into the front ranks,” says Pinker, who joined the department in 2003. Pinker also cites Kosslyn's leadership through the financial crisis as evidence that he's the right person to turn around the Stanford center: “I'd be surprised if Steve didn't make the center financially viable and intellectually vibrant within a few years.”

    The world was a different place when the center was founded, at the midpoint of a century that had brought two world wars, the Holocaust, and the Great Depression. And then there was the Cold War. A San Francisco Chronicle article about the center's launch raised the specter of mind-control methods presumably under development in the Soviet Union. “This could be a weapon of great power in Communist hands, unless comparable advances in the West produce effective countermeasures,” the article reads, apparently quoting from a statement from a group of social scientists convened by then–Vice President Richard Nixon.

    The center's founders realized that solutions to societal problems could come from the social and behavioral sciences, but they also realized how poorly developed these disciplines were, says Robert Scott, a former deputy director of the center and its de facto historian. One early adviser, the Austrian-born sociologist Paul Lazarsfeld, proposed that the center be based on the European model, in which promising young students learn at the feet of the masters, Scott says. That model didn't last long. “It was very un-American,” Scott says with a chuckle.

    Instead, a more egalitarian culture quickly emerged. Despite its somewhat isolated setting, the center was designed to foster informal exchanges among its 50 or so resident fellows, Cole says. Cozy, low-slung buildings house private offices for each fellow, but their arrangement around several courtyards forces a certain amount of walking between buildings, increasing the likelihood of chance encounters. Lunch is not left to chance, however: Fellows were, and still are, expected to attend lunch each day, catered by the center's private chef. “Where you eat you tend to talk, and where you talk you get ideas,” Cole says.

    The center has no permanent faculty or students and no laboratories. In the early days, candidates were nominated by past fellows and other academics and ultimately selected by the center's board. For the chosen ones, an invitation would suddenly arrive out of the blue. “It was considered a huge honor,” Cole says. Alumni include 22 Nobel Prize winners, 10 Pulitzer Prize winners, 44 MacArthur Fellows, and 128 current members of the National Academy of Sciences. “The simplest way to describe what the center then offered was what most people who went into academia thought they were getting into and never found—that is, a genuine community of intensely interacting, smart, interesting people from diverse fields,” Scott says.

    Paradigm shift

    “The center was a phenomenal place in its time for a select group of people who had a rare kind of experience that couldn't be gotten anywhere else,” Cole says. “But that's not the world we live in today.” In one nod to modernity, the old-boys' network (literally, for the most part) that hand-selected fellows was replaced years ago by a competitive application process. Nearly half of the members of this year's class are women. But other changes have reduced the center's magnetic pull, Cole says. The Internet has made far-flung collaborations easier, for example, and more researchers today have spouses with professional careers, making it difficult for some prospective fellows to move to Palo Alto for a year. (Fellows typically visit the center on sabbatical from their home institutions, which continue to pay at least part of their salaries.)

    Then came the financial crisis. By 2008, the returns on the center's endowment were barely keeping up with its growing operating costs, and the board agreed to let Stanford take over the center's finances and administration. The center's endowment was folded into Stanford's much larger fund just before the stock market crashed. Making matters worse, foundations that once contributed to the center have tightened the purse strings. “They're not interested in paying for people to come here for a few months and maybe write a book, and I don't blame them,” Kosslyn says.

    Kosslyn thinks shifting the focus to real-world applications of behavioral science will make the center more appealing to donors. One way he wants to do this is by creating “impact networks” of six to eight scholars, including resident fellows and outside collaborators, to work on specific problems. One proposed network, for example, would study how to design software for elderly people that accounts for age-related changes in vision, memory, and coordination. Another team would identify the daily grievances that most foster resentment and mistrust between Palestinians and Israelis and obstruct the path to peace. Kosslyn hopes to fund these and a handful of other projects he's proposed through a combination of federal grants and donations from foundations and individual donors.

    Kosslyn also hopes to create “center institutes,” each headed by a resident scholar, that would tackle broader issues. His ideas for these include a war crimes archive and a center on personalized data mining that would examine how search engine and other companies that collect personal data online could use it to benefit the people who provided it. Kosslyn envisions these centers generating specific questions to be tackled by future “impact networks” of scholars.

    At the same time, Kosslyn says he wants to preserve and encourage the interdisciplinary discourse the center has always provided. One day last fall, fellows gathered for a panel discussion, part of a new series Kosslyn started to foster crosstalk among disciplines. The topic was culture and economics, and the panel comprised two Stanford economists, Peter Reiss and Frank Wolak, and Jing Tsu, who studies Chinese language and culture at Yale University. They discussed how to define culture and determine its influence on economic activity (and vice versa) and whether cultural scholars might use tools from economics to quantify culture.

    Kosslyn realizes that preserving the center's idyllic environment and getting scholars talking to each other will be the easy part. His plans to establish the center as a hub for translational social science research will require raising money in extremely difficult times. But he says the amounts are fairly modest, perhaps $200,000 per year for an impact network and $300,000 per year for a center institute. He thinks he can convince potential donors that it would be money well spent. The line between philanthropy and investment is getting blurrier, Kosslyn says, and the center has to prove that it can give something back. Its future may depend on it.
