News this Week

Science  18 May 2012:
Vol. 336, Issue 6083, pp. 786


  1. Around the World

    1 - Washington, D.C.
    Senate Bill Would Preserve Helium Supply for Research
    2 - Washington, D.C.
    Summit Generates Global Principles for Peer Review
    3 - Brussels
    European Parliament Rebukes Regulatory Agencies
    4 - Mongstad, Norway
    Norway Unveils Carbon Capture Plant
    5 - Shanghai, China
    U.S. Tweets Chinese Air Quality

    Washington, D.C.

    Senate Bill Would Preserve Helium Supply for Research


    A new bill would alter the ongoing sales of the United States' reserve of helium, an element used in the manufacture of semiconductors and microchips and essential for cooling experiments to near absolute zero. The proposed Helium Stewardship Act of 2012 would ensure a supply of helium for science through about 2030 by slowing sales of the reserve, held near Amarillo, Texas (pictured); requiring the federal government to charge the market price for helium; and holding back the last 85 billion liters of the gas for federal users, including holders of research grants. The bill would also avert an acute shortage by continuing the sales past next year, when the 1996 law authorizing them will expire. The sales began in 2003 and supply 30% of the world's helium.

    Moses Chan, a physicist at Pennsylvania State University, University Park, welcomes the bill, which was proposed by Senator Jeff Bingaman (D-NM). “They are saying that they are not getting out of the helium business” after next year, says Chan, who served on a 2010 study by the National Academies' National Research Council that criticized the current fixed-price sales.

    Washington, D.C.

    Summit Generates Global Principles for Peer Review

    A group of research administrators from 44 countries has issued the first-ever international principles governing peer review of grant proposals. Meeting this week at the National Science Foundation, the heads of research agencies from around the world pledged to use experts with the appropriate knowledge, disclose in advance any potential conflicts of interest, maintain the confidentiality of a proposal's content, provide applicants with relevant feedback, and make their rules transparent to the community.

    Participants also announced the formation of a Global Research Council, a virtual entity that will tackle important problems affecting funding agencies around the world. Regional meetings will provide input for an agreement to be unveiled at an annual gathering held at a different site each year. Next year's meeting will be sponsored by Germany and Brazil, for example, and will take place in May in Berlin.


    Brussels

    European Parliament Rebukes Regulatory Agencies


    The European Parliament on 10 May postponed signing off on the 2010 expenditures of three science-related agencies amid concerns over conflicts of interest and inappropriate spending.

    The vote to postpone the so-called discharge—which affects the European Food Safety Authority (EFSA), the European Medicines Agency, and the European Environment Agency—has no immediate consequences, but it is seen as a warning that the agencies need to restore their credibility. It came 2 days after EFSA's head, Diána Bánáti, resigned because she had returned to her job at an industry-funded group.

    The agencies, which provide the European Commission with scientific advice, have been under fire from members of the European Parliament for several years for being too close to industry and other parties. Members have used their right to withhold approval of past spending as a way to express their discontent and put pressure on the agencies.

    The European Parliament will vote on the three budgets again in October; it has asked the agencies to come up with stricter and more transparent conflict-of-interest policies first.

    Mongstad, Norway

    Norway Unveils Carbon Capture Plant

    Norway opened the doors to the world's largest carbon capture and storage (CCS) test facility on 7 May. The project, called the Technology Centre Mongstad (TCM), is a joint venture between the Norwegian government, Norwegian energy company Statoil, South Africa's Sasol, and Shell. At $1 billion, TCM's price tag is 10 times the original cost estimate.

    TCM's aim is to test, verify, and demonstrate carbon-capture technologies, which currently are expensive and consume a lot of power. The center will test two different liquid-based post-combustion technologies, each designed to filter out as much as 85% of the carbon dioxide in flue gas. One uses chilled ammonia and the other uses an amine solvent to trap the gas. The plant's operators will experiment with different concentrations of carbon dioxide and different flow rates, using the center's more than 4000 instruments to monitor the tests.

    The TCM test center can't yet store the carbon that it captures, however. The full-scale carbon capture center, including storage, was originally scheduled to be online in 2014. It has been delayed to 2018.

    Shanghai, China

    U.S. Tweets Chinese Air Quality

    A new Twitter feed run by the Shanghai U.S. consulate is charting the city's air quality—and so far the prognosis is not good. On 12 May, the feed, @CGShanghaiAir, began reporting hourly readings of concentrations of harmful particulate matter less than 2.5 micrometers in diameter (PM2.5), as measured inside the consulate's compound. The bulk of tweets in the feed's first few days reported air quality index levels from 100 to 200—which the U.S. Environmental Protection Agency classifies as “unhealthy for sensitive groups” or “unhealthy.”

    The consulate's reports conflict with those by the Chinese government, which have largely focused on measurements of PM10 (Science, 1 August 2008, p. 636), and the new feed could rile Chinese officials, who protested similar air quality tweets by the U.S. Embassy in Beijing in 2009. After locals disseminated the Beijing embassy results on Chinese microblogs, China took some steps in January toward openness. But more accurate figures won't fix the air itself. And that, as @BeijingAir reported on a day when index levels exceeded 500, can be “crazy bad.”

  2. Random Sample

    Gene Flow Makes for Copycat Color Patterns


    This colorful butterfly is one of hundreds of subspecies of the 43-species genus Heliconius, which has varied color patterns and is considered a classic example of mimicry used to ward off potential predators.

    An analysis of the DNA sequence from three Heliconius species and several subspecies reveals that those with the same color patterns have the same versions of key genes—holdovers from extensive hybridization within the genus, the authors report this week in Nature. Hybridization is thought to be detrimental, as offspring are often less fit, but the acquisition of the right color patterning genes proved advantageous enough that the genes were retained, says Adriana Briscoe, an evolutionary biologist at the University of California, Irvine.

    The Heliconius genome sequencing project also revealed much greater than expected conservation between moth and butterfly genomes. The butterfly also has an impressive array of olfactory receptor genes. Until now, butterflies were not thought to rely much on a sense of smell, unlike moths, which depend heavily on long-distance odors. But that difference may not hold up; instead, moths may really be “butterflies of the night,” says Briscoe.


    Join us Thursday, 24 May, at 3 p.m. EDT for a live chat on nanomedicine.

    By the Numbers

    64% — Percentage of global deaths of children under 5 caused by preventable infectious diseases in 2010, according to a study published on 9 May by The Lancet.

    3.6% — The percentage of adults in the United States who sleepwalk, according to a study published on 14 May in Neurology. The study looked at 19,000 adults in 15 states.

    1,000,000 — Number of pages now contained in the online biodiversity resource Encyclopedia of Life, which reached the milestone on 9 May.

    New Candidate for World's Oldest Cave Art


    Since their discovery in 1994, the 37,000-year-old paintings of lions, rhinos, and other animals in southern France's Chauvet Cave have been the oldest known cave art. Now a team working at the Abri Castanet, a cave in southern France's Vézère valley, claims to have discovered evidence of artistic activity that may be older than Chauvet.

    Working in the cave, archaeologist Randall White of New York University in New York City and his team found limestone blocks adorned with engravings, including one the team interprets as female genitalia. The blocks were impossible to date, but in 2007, the team excavated another fallen block engraved with a vulva-like image; the block rested directly on a segment of cave once occupied by humans and littered with animal bones that dated to between 36,000 and 37,000 years ago.

    That would mean that the works of art at Abri Castanet are at least as old as those at Chauvet, White and colleagues conclude in a paper published online 14 May in the Proceedings of the National Academy of Sciences.

    Mastering Clairvoyance

    A recent master's degree thesis at European University Viadrina Frankfurt (Oder) in Germany is generating waves—though not the “time waves” the student was looking for.

    Orthopedist Peter Conrad had set out to study a hollow aluminum cylinder known as a Kozyrev mirror, which proponents claim opens a spacetime channel that enables a person to communicate telepathically, see aliens and UFOs, and even contact the dead.

    In what thesis supervisor Harald Walach of the Institute for Transcultural Health Studies at Oder calls a “smart experiment,” Conrad tested whether the mirror also facilitated clairvoyance. Test subjects held on to wires either connected to a small Kozyrev mirror or to a mockup. A piece of paper with a number written on it was placed in the cylinders. In one set of experiments, he found, people connected to a Kozyrev mirror were significantly better than average at guessing the numbers.

    The work won Conrad his master's degree last year—but some scientists see it as part of a worrying trend of pseudoscience finding a home in publicly funded universities. Responding to those critics, Walach wrote in a statement: “This is an exploratory finding, that can be seen as a sign for clairvoyance.”

    Walach's institute has sparked debate before when it included energy medicine, faith healing, and other esoteric practices on its curriculum in 2010. Those plans were later scrapped. But there have been similar situations at other universities in recent years. “It appears that such nonsense clusters at certain institutes and around certain professors,” says German-born Edzard Ernst, professor of complementary medicine at the University of Exeter in the United Kingdom. “It is regrettable and alarming when German universities produce such hogwash disguised as science.”

  3. Newsmakers

    Senior DOE Official Resigns

    Arunava Majumdar, the head of the Department of Energy's Advanced Research Projects Agency-Energy (ARPA-E), will leave his post 9 June, Energy Secretary Steven Chu wrote in a 9 May e-mail to agency staff members. Biochemist Eric Toone, a former professor at Duke University in Durham, North Carolina, and the deputy director of technology for ARPA-E, will become the new head.



    Majumdar's departure “is a kick in the stomach,” but Toone “will keep the agency in good hands,” says Barton Gordon, a former member of the U.S. House of Representatives who spearheaded the creation of ARPA-E in 2007. Gordon, now a lobbyist with K&L Gates in Washington, D.C., says Majumdar is a “good scientist and a good organizer who created a good bipartisan following [in Congress] for ARPA-E.”

    Modeled after the Pentagon's Defense Advanced Research Projects Agency, ARPA-E is designed to funnel money quickly to high-risk, “transformational” efforts to develop new energy technologies. The agency has won broad backing from industry, and has fared relatively well in the annual budget battles. This year it will spend about $300 million on a wide array of projects, down from a high of $400 million in 2010.

    They Said It

    “Industry has used this study in ways that are improper and untruthful.”

    —Fire protection engineer Vytenis Babrauskas of Fire Science and Technology Inc. in Issaquah, Washington, commenting to the Chicago Tribune on 9 May about how chemicals manufacturers have allegedly misrepresented a study he conducted 25 years ago while he was at the then-National Bureau of Standards. The study examined whether flame retardants added to furniture and other household objects could slow fires.

  4. Archaeology

    Near Eastern Archaeology Works to Dig Out of a Crisis

    1. Andrew Lawler

    In the wake of the Arab Spring, archaeologists in the Near East are locked in a struggle for the survival of their field.

    New digs.

    Many regions of the Near East are closed to archaeologists, but new areas, such as this dig at Abu Tbeirah in southern Iraq, are opening up.


    WARSAW—The day he left Switzerland for Syria in spring 2011, archaeologist Oskar Kaelin of the University of Basel received a warning from the Swiss foreign ministry about deteriorating political conditions there. He decided to go anyway. It was only his second season leading a team digging a palace left by the mysterious Mitanni Empire, which ruled a large part of the Near East around 1400 B.C.E. “We had the funding, and our luggage was already packed,” Kaelin recalls. The team members spent a fruitful 2 months excavating, and when they left in early June, they were among the last foreigners on Syrian soil. They left behind not just their site but also artifacts and samples still awaiting analysis. Kaelin doesn't know when—or if—he can return to complete his research.

    The Near East was the birthplace of farming, animal domestication, cities, empires, and writing, and so it exerts a powerful pull on archaeologists. The region is also the birthplace of modern archaeology itself, which began in the 19th century amid the mounds of Mesopotamia. Yet at the field's first gathering* since the Arab Spring unleashed unrest from Morocco to Oman, researchers were worried about the future of archaeology's flagship subdiscipline. More than 120 foreign teams were abruptly shut out of Syria, Egypt is taking a xenophobic turn, parts of Iraq remain prone to violence, and Iran remains virtually sealed off. Even peaceful countries are more difficult for foreign archaeologists to access.

    Although about 600 scientists crowded the halls here to listen to nearly 500 presentations, there was a note of quiet desperation in the air. “We're still in shock,” says Daniele Morandi Bonacossi of the University of Udine in Italy, who has centered his professional life on the ancient western Syrian capital of Qatna, to which he cannot return for the foreseeable future. “It's getting hard to be a Near Eastern archaeologist,” he says. Frank Hole, an emeritus Yale University anthropologist with more than 40 years of experience in the region, agrees: “The discipline is in crisis.”

    Lost city.

    Bonacossi spent years digging here at Qatna in Syria but now cannot access his site or its finds.


    That crisis is due not just to the immediate upheavals. After 3 decades of political instability, grant money is drying up, new students are wary of entering the field, and retiring professors are often not being replaced. But resourceful researchers are finding new places and ways to gather their data. Some have started work in a few stable and long-neglected pockets in the region, such as Iraq's Kurdistan. Others are pioneering advanced remote-sensing techniques, as well as DNA, isotopic, and other analyses to squeeze out more information from the material they can access. As a result, some see a silver lining in the current crisis. “This may prove a moment of promise,” says Jason Ur, a Harvard University archaeologist.

    Modern archaeology has its roots deep in the dry soil of the Middle East. However, Iran's 1979 revolution, its subsequent war with Iraq, and two gulf wars ended the era of foreign-dominated teams conducting large digs in those two countries. Archaeologists shifted to Syria, Jordan, and Turkey. But Jordan lacks an extensive urban legacy, and Turkey has become increasingly reluctant to grant foreigners dig permits. That makes Syria's closure a hard blow. “Now we are all trying to find other places,” Kaelin says. “Everyone will go to Kurdistan.”

    Kurdistan, an autonomous region about the size of Belgium in Iraq's north and east, was long off-limits to outsiders under the regime of Saddam Hussein. But today it welcomes foreign archaeologists and has relatively stable politics. “There has been no systematic research carried out here for the last 80 years,” says Marta Luciani, an archaeologist at the University of Vienna who has long worked in Syria. Last year, her team mapped more than 50 previously unknown sites; one covers 5 hectares and is littered with pottery dating from 6000 B.C.E. to Islamic times.

    Kurdistan's prehistory goes back even further. In the 1950s, archaeologists here found 40,000-year-old Neandertals buried in Shanidar Cave with what some argued were funeral rituals; the University of Cambridge in the United Kingdom is in talks to reopen the excavations, says university archaeologist Graeme Barker.

    Kurdistan lies east of the oldest known Neolithic settlements, but researchers say sites in Kurdistan may offer clues to the transition to farming and settled life between about 10,000 B.C.E. and 7000 B.C.E. During initial excavations in the western Zagros Mountains in Kurdistan this March and April, for example, Roger Matthews of the University of Reading in the United Kingdom and colleagues found ancient grinding stones, hearths, architecture, and obsidian that may have come from as far as 200 kilometers away. Kurdistan is also the location of a main road linking Assyria with Babylon, two of the region's big powers in the first 3 millennia B.C.E. Harvard's Ur, who previously worked in Iran and Syria, hopes to probe two provincial Neo-Assyrian cities dating to about 800 B.C.E. and their hinterlands to understand how this great empire organized its cities, populations, and sophisticated waterworks.

    Closed borders.

    Turmoil across the Middle East is making it tougher for foreign archaeologists to study early agriculture and urban development.


    “Everyone is scrambling to get in on the ground floor” of work in Kurdistan, says Tina Greenfield, an archaeology Ph.D. candidate at the University of Cambridge. She is analyzing animal bones at ancient sites in Kurdistan but cannot cross the internal Iraqi border to the city of Mosul to study a bone collection there. But some archaeologists believe that the Kurdish promise is overblown. “It's the new hype,” says Peter Pfälzner of the University of Tübingen in Germany, who worked in Syria.

    Other alternatives for archaeologists are Persian Gulf states such as Oman and Kuwait (Science, 17 February, p. 790) and the Caucasus region of the former Soviet Union. Sandwiched between the Black and Caspian seas, the Caucasus has a bloody modern history that has kept it mostly off-limits, but Georgia and Azerbaijan now welcome digs.

    In 2009, at a site called Kamiltepe in Azerbaijan, Barbara Helwing of the German Archaeological Institute in Berlin and her Azerbaijani colleagues found a large, round, brick platform dating to a surprisingly early 6000 B.C.E. The 24-meter-wide, 2.5-meter-high platform shows that “people are building monumental structures and transforming their lives” before archaeologists had expected such complexity, Helwing says. European teams are also starting work in neighboring Georgia, focusing on Neolithic connections among this region, the steppes to the north, and the emerging urban cultures of the south.

    Other researchers are applying new tools to old samples. Heat, humidity, and poor bone preservation have made it extremely difficult to extract ancient DNA from Near Eastern bones, essentially leaving the discipline out of the genetic revolution. But at the meeting, biologist Henryk Witas of the University of Łódź in Poland presented preliminary evidence of ancient mitochondrial DNA from human teeth from a half-dozen skeletons at two sites in eastern Syria dated to various times in the 3rd millennium B.C.E. Most of the DNA was related to haplotype group M, which is not found in people living in the Middle East today but is common among those now living in northern Pakistan, India, and Tibet. Witas concluded that people migrated from the northern part of the Indian subcontinent along trade routes to the west as early as 2500 B.C.E.

    This surprising conclusion was hotly disputed by others, who suspect that the M group once existed in the Near East but has been diluted since. “There is no archaeological evidence of Central Asian migration” before medieval times, notes archaeologist Maria Grazia Masetti-Rouault of the Sorbonne University in Paris, who excavated the Syrian sites. “It is way too premature to make any conclusions from this,” adds Reinhard Bernbeck of the Free University in Berlin. But the paper demonstrated that, here as elsewhere, ancient DNA may serve as a key tool.

    Archaeologists are also making heavier use of remote-sensing data. For example, Jesse Casana, an archaeologist at the University of Arkansas, Fayetteville, who once worked in Syria but now focuses on the United Arab Emirates, is experimenting with an instrument-heavy drone to help in mapping.

    The closure of Syria is only the latest sign of trouble for ancient Near Eastern archaeology. Thanks to decades of lack of access to places like Iraq, prominent U.S. universities such as Harvard, Yale, and the University of Pennsylvania are not replacing retiring professors in the field. “Basically, we don't teach ancient Near Eastern archaeology anymore,” says Yale's Hole. “It isn't viable.” European researchers say they are experiencing a similar trend. “I'm one of the last Near Eastern archaeologists in Switzerland,” adds Basel's Kaelin.

    Others note that without sure places to dig, winning grants is difficult. “Once you lose excavation funding, it is hard to get it back,” worries Udine's Bonacossi. Such uncertainties may create a vicious circle in which attracting new talent is tough. “There is a chilling effect,” Ur says.

    Still, the winds of change in the Middle East aren't all ill-favored for excavators. An American team and an Italian team briefly dug at sites in southern Iraq this winter and spring—the first time in nearly a decade that foreigners had returned to excavate in the heartland of ancient Sumer. During a brief survey last year, archaeologist Carrie Hritz of Pennsylvania State University, University Park, discovered that the ancient Mesopotamian city of Girsu is more than four times as large as researchers had thought; she plans a detailed survey this summer. Hritz and some colleagues of her generation are linking their efforts with the practical needs of Iraqis today, assisting in the search for clean water on the parched landscape.

    Bonacossi remains bullish on the future of a field that has coped before with revolution, civil war, and arguably the world's most complex and explosive politics. “There are countries where we can still dig,” he says. “The current trouble may just be history repeating itself.”

    * Eighth International Congress on the Archaeology of the Ancient Near East, 30 April–4 May, University of Warsaw.

  5. Infectious Diseases

    Can New Chemistry Make a Malaria Drug Plentiful and Cheap?

    1. Kai Kupferschmidt*

    German chemist Peter Seeberger says he has developed a cheaper way to produce a key malaria drug. Now, he's trying to convince the rest of the world.

    HALLE, GERMANY—Television is not Peter Seeberger's natural medium. With several cameras trained on him, the 2-meter-tall chemist looks a little stiff in his dark suit, and he delivers his lines with an earnestness that is slightly at odds with the gee-whiz tone of the science show recorded here. He sounds like he's selling insurance.

    Structural change.

    Seeberger thinks his finding will have a big impact on the global artemisinin market.


    But Seeberger, head of a team of 70 researchers at the Max Planck Institute of Colloids and Interfaces in Potsdam, came to sell an idea: that artemisinin, the world's most important anti-malaria medication, can be produced much more cheaply and easily. In January, Seeberger published a paper outlining how a technique called flow chemistry might make a key step in the drug's production chain more efficient. If the promise comes true, it could be a boon for the global fight against malaria, because the current price, between $0.80 and $1.20 per treatment course, is still a major factor hampering access to artemisinin drugs.

    Some scientists are impressed. “The impact of this is hard to overestimate,” says Jack Newman, one of the founders of Amyris Inc. in Emeryville, California, a biotech company that has played a key role in other recent attempts to make artemisinin cheaper and more abundant. Prashant Yadav, an expert on malaria drug supply chains at the University of Michigan, Ann Arbor, says Seeberger's discovery is a “powerful technology, and it has the potential to increase global access to malaria medicine.”

    But Seeberger's method has yet to prove its mettle. It needs to be scaled up, and he can't say how much prices would come down if it worked. Using it in a large facility would require a massive investment, and so far, nobody has stepped up to the plate. What's more, pharma giant Sanofi will open a brand-new facility later this year to make artemisinin therapies based on Amyris's technology: yeast cells that produce a precursor of the drug. Although Seeberger says his discovery would complement that process, Sanofi says it's too late now to adopt it.

    Plants to pills

    Seeberger, 45, is known as a brilliant and driven researcher who's keen for his chemistry to make a difference—and who likes to try unconventional approaches. For most of his career, he has worked on sugars, or carbohydrates, which play many important roles in nature; they coat organs, cells, and molecules and facilitate all kinds of cellular communication.

    When Seeberger started out as a scientist, carbohydrates were seen as career killers because they were hard to analyze and even harder to synthesize. Defying skepticism, he developed a new technique to string sugar molecules together and built a synthesizer that can churn out specific carbohydrate sequences on demand, much like peptide and nucleotide synthesizers. He has started a company to make the machine commercially available.

    Since he came to the Max Planck Institute in 2008, Seeberger has used his expertise to develop carbohydrate-based vaccines against antibiotic-resistant Staphylococcus aureus, anthrax, plague, and malaria. The first animal studies of some candidate vaccines have been promising, although none has yet reached the stage of clinical studies. “But the artemisinin project is the most important discovery I have ever made in terms of immediate impact,” he says.

    To illustrate his work, he has brought a potted plant that looks a bit like oversized parsley with him to the TV studio. It's sweet wormwood (Artemisia annua), which has been used against fever for centuries in traditional Chinese medicine. In the 1970s, Chinese scientists isolated the active compound, artemisinin, and tested it as a malaria drug. It was the start of a worldwide revolution. Today, co-formulated with older drugs in so-called artemisinin-based combination therapies (ACTs), artemisinin and various close cousins have become the first line of defense against malaria.

    But artemisinin is a difficult medicine to produce. Chemists can synthesize it from scratch, but that's prohibitively expensive. Instead, producers still rely on sweet wormwood, most of it grown in China, which makes artemisinin one of the last major drugs to be harvested from plants. Artemisia usually contains less than 1% artemisinin, however, and it takes almost 18 months from planting the seeds to extracting the compound. Supply has gone up and down, taking prices for drug companies on a roller-coaster ride from $400 to $1100 per kilogram.

    Lower production costs would have a big impact on global health, Yadav says. In government-run clinics in most developing countries, malaria drugs are subsidized by big international donors. Cheaper drugs would mean they could help more patients. And many people in developing countries still rely on private pharmacies, where they often buy other, much cheaper and less effective drugs instead of ACTs. “There, every penny reduction really counts,” Yadav says.

    So far, the yeast cells developed by Amyris—which was co-founded by Jay Keasling of the University of California, Berkeley—seem to provide the best chance of a more stable artemisinin supply. Amyris, which is supported by the Bill and Melinda Gates Foundation, has licensed the technology to Sanofi, which has built a special plant in Garessio, Italy, where it plans to start producing ACTs later this year. But those cells have one drawback: They don't produce the final product, artemisinin, but a precursor, artemisinic acid, that still needs to undergo a chemical reaction.

    Artemisinin's activity depends on its so-called endoperoxide group, a bridge of two oxygen atoms that spans one of the molecule's three rings (see graphic, right). To make artemisinin out of artemisinic acid, chemists need to create the bridge by introducing reactive oxygen molecules called singlet oxygen. This can be done by shining light on “normal” oxygen molecules, a process known as photochemistry.

    In a first step, artemisinic acid is reduced to dihydroartemisinic acid. This product is mixed with oxygen; the light activates the oxygen, which reacts with the acid to produce another precursor. Then, trifluoroacetic acid is added to the mix, cleaving a carbon ring in the molecule, which reacts with the molecular oxygen to produce artemisinin.

    Sanofi plans to do this using so-called batch chemistry, in which a product is created via a series of steps in big vats, including, in this case, a specially developed glass reactor. But Seeberger says photochemistry is not well-suited to this classical approach because the bigger reaction vessels get, the less light they let in, and the less reactive oxygen is produced. Instead, he uses flow chemistry, a technique in which the reactions happen while the chemicals are flowing through a thin tube wrapped around a light source, which dramatically increases the volume in which reactive oxygen is produced.
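The scaling problem Seeberger describes can be made concrete with a back-of-the-envelope calculation. Light entering an absorbing solution decays roughly exponentially with depth (the Beer-Lambert law), so only a thin outer shell of a large vessel is effectively illuminated, while a narrow tube can be lit throughout. The sketch below is purely illustrative; the penetration depth and vessel dimensions are hypothetical numbers, not figures from Seeberger's or Sanofi's processes.

```python
def illuminated_fraction(radius_m, penetration_depth_m):
    """Fraction of a cylinder's cross-section lying within one
    light-penetration depth of its externally lit wall."""
    if penetration_depth_m >= radius_m:
        return 1.0  # narrow tube: light reaches the whole volume
    inner = radius_m - penetration_depth_m
    # volume fraction of the illuminated outer annulus
    return 1.0 - (inner / radius_m) ** 2

depth = 1e-3  # assume light is absorbed within ~1 mm of solution

vat = illuminated_fraction(0.5, depth)      # hypothetical 1 m-diameter batch vat
tube = illuminated_fraction(0.5e-3, depth)  # hypothetical 1 mm-diameter flow tube

print(f"batch vat: {vat:.1%} of volume lit")   # well under 1%
print(f"flow tube: {tube:.1%} of volume lit")  # the entire volume
```

Under these assumptions, scaling up a batch reactor leaves almost all of the liquid in the dark, whereas a flow tube wrapped around a lamp keeps every part of the stream illuminated as it passes, which is the intuition behind Seeberger's choice of flow chemistry.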

    “This approach makes a lot of technical sense,” says Frank Gupton, a chemist at Virginia Commonwealth University in Richmond. Flow chemistry is ideally suited for reactions that need light, Gupton says, but chemical engineers have been slow to adopt it: “The chemical culture is really steeped in the batch chemistry mentality.”

    Waste not, want not?

    But what will happen with Seeberger's discovery is still unclear. Sanofi's plant is about to open, and the company isn't going to bet on an entirely new technique that has yet to prove that it can be scaled up. In an e-mail to Science, the company calls Seeberger's solution “a clever approach,” but says that “so far the competitivity of this technique has not been demonstrated.”

    The ideal solution, Yadav says, would be for other companies to adopt the combination of Amyris's yeast cells and Seeberger's method: “Then, the price for the drugs could go down significantly.” But a spokesperson for OneWorld Health, the nonprofit pharmaceutical company that has backed Sanofi's project, says there are no plans to make the yeast cells available to any other party.

    Seeberger says there are other ways to use his technique that don't rely on the production of artemisinic acid by yeast cells at all. Sweet wormwood plants often produce artemisinic acid in higher quantities than artemisinin, and Seeberger's initial idea was to use the artemisinic acid contained in waste, which extractors currently throw away. In a sample from extractors from Madagascar, Seeberger's team found 1% artemisinic acid. To his surprise, it also contained 5% dihydroartemisinic acid. The latter would be an even better starting material, eliminating the first reduction step.

    New route.

    Artemisinin is currently extracted from Artemisia annua plants. Later this year, Sanofi plans to start making the drug from genetically engineered yeast cells that produce the precursor artemisinic acid. Sanofi is using batch chemistry to convert artemisinic acid to artemisinin. Seeberger's flow chemistry process starts with artemisinic acid. The compound is also present in the plants, but that source is not currently used.


    But Yadav says the idea of turning waste into medicine is far-fetched. “It is a fairly fragmented market with multiple extractors. You would have to collect the waste from all these small companies and establish a supply system,” he says.

    Another option would be to ask extractors to start harvesting artemisinic or dihydroartemisinic acid directly from the plants, instead of artemisinin. Already, Seeberger is testing varieties of the plant to see how much of each compound they contain. But this would mean that the extraction companies have to make a major shift in their production process, requiring time and technical support from the global community.

    On 19 April, Seeberger invited interested parties to a meeting in Berlin to explore the options. They included representatives of Artemisia growers and extractors, pharmaceutical companies GlaxoSmithKline and Boehringer Ingelheim, as well as the Clinton Foundation, UNITAID, and the German Agency for International Cooperation. (The Bill and Melinda Gates Foundation canceled at the last minute.) None of the funders wanted to discuss the meeting with Science. Seeberger says he was asked many critical questions—“But then the next day, my phone did not stop ringing.” He is now in discussions with several interested parties, he says.

    But for the moment, he is soldiering on by himself. Since January, a new postdoc has taken over the project; by changing the solvent and the temperature and introducing an LED as the light source, he has increased the yield from 40% to 65%. And to start the scale-up, Seeberger has just ordered a new LED 10 times as big as the one he previously used. When it arrives, he wants to build a new reactor that can churn out a kilogram of artemisinin per day. The next reactor would be 10 times bigger again and capable of producing more than 3 tons of artemisinin per year, enough for a few million treatments.
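The scale-up arithmetic above can be checked directly. The 1 g of artemisinin per adult treatment course used below is an illustrative assumption, not a figure from the article:

```python
# Seeberger's planned scale-up, as described in the text.
kg_per_day = 1.0                       # first new reactor
annual_kg = kg_per_day * 365           # roughly 0.37 t per year
bigger_annual_kg = annual_kg * 10      # next reactor: 10x bigger

# Assuming very roughly 1 g of artemisinin per adult treatment course
# (an invented round number for illustration):
grams_per_treatment = 1.0
treatments = bigger_annual_kg * 1000 / grams_per_treatment

print(f"{bigger_annual_kg / 1000:.2f} t/year, "
      f"about {treatments / 1e6:.1f} million treatments")
```

That works out to about 3.65 tons a year, consistent with the "more than 3 tons" and "a few million treatments" quoted in the text.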

    Drumming up more interest was also one reason he agreed to appear on TV. “So one day we might be able to say that this experiment has saved millions of lives?” the presenter concludes as Seeberger's 5-minute appearance draws to a close.

    “Yes, that is the hope,” Seeberger answers.

    Kai Kupferschmidt is a science writer in Berlin.

  6. Parsing Terrorism

    1. Eliot Marshall

    Terrorism research has expanded rapidly since 9/11, shifting its focus from human pathology to the analysis of how rational people interact with violent groups.

    Early reports from a catastrophe on 11 November 1982 in Lebanon's seaside town of Tyre spoke of a suspicious car on the street near a seven-story building. Some people said there was no car. But everyone remembered what happened at about noon: A huge blast ripped through the concrete structure, a base for Israeli forces engaged in a cross-border battle with Palestinians. Scores of Israeli soldiers were killed, along with many other people. It was one of Israel's single greatest military losses.

    After an inquiry, Israeli authorities declared it an accident. The explanation: Bottled cooking gas had exploded on the first floor, knocking apart the flimsy walls. But in a clashing version of history, Palestinian militants later claimed that a teenager named Ahmad Qasir died delivering a bomb.

    Five months later, a truck loaded with explosives drove into the U.S. Embassy in Lebanon's capital, Beirut, killing 63 people. On 23 October 1983, two more vehicle bombs hit Beirut, one striking a building full of French peacekeepers, killing 58, the other destroying the city's U.S. Marine barracks, killing 241. On 4 November 1983, yet another early morning explosion occurred in Tyre, destroying buildings and killing 29 Israeli soldiers and 32 prisoners. Several more blasts hit other sites in December 1983.

    Looking back on the carnage, experts see this period as the dawn of modern, very lethal terrorism. Powered by high explosives and enabled by modern transport systems, much of it has been carried out by young men who knew they would die in the attack. This disturbing form of violence became a global concern after hijackers crashed planes into the Twin Towers in New York City and two other U.S. locations, killing 2996 people on 11 September 2001. Fear surged, and with it a demand for information. People wanted to know who might become a terrorist, what motivates such killers, and what could be done to stop them.

    The U.S. government has spent millions of dollars since 9/11 on efforts to answer such questions. It has invited academics to help and encouraged publication in open, peer-reviewed journals. This has produced a bounty of new data sets, case studies, opinion surveys, books, and journal articles. The methods and aims adopted by researchers vary—so much, says economist Eli Berman of the University of California, San Diego (UCSD), who has leapt into this field, that results sometimes seem “like spaghetti.” He counts himself part of a new cadre of investigators who strive for quantitative rigor and testable theories.

    Bloody Beirut.

    The destruction of U.S. Marine Headquarters in Lebanon by a suicide bomber in 1983 marked a lethal new era of terrorism.


    The study of suicide terrorism in particular has been a “growth industry,” says political scientist Martha Crenshaw of Stanford University in Palo Alto, California, a longtime terrorism researcher. She says the work is uneven but becoming more sophisticated. That opinion is echoed by Gary LaFree, director of the National Consortium for the Study of Terrorism and Responses to Terrorism (START) at the University of Maryland, College Park. “There's been a huge improvement in what we know about terrorist groups,” says LaFree, who is proud that in recent years, “we've cracked into most of the major journals.”

    The work has challenged some long-accepted notions—for example, that terrorists are pathological, driven by religious fanaticism, or spurred by poverty. It's now clear that many terrorists are well-educated and seemingly rational. Economists in particular have explored a disturbing notion, probing terrorism as a form of “altruism” and comparing it to exclusive churches and social aid groups. The studies of cultlike groups are fascinating, Crenshaw says, but don't answer the deepest questions about what makes terrorists tick: “I am not sure that we're closer to understanding” why some groups are violent and others are not.

    Give me data

    In the 1990s, “there was no one on the planet—at least in the unclassified world—who could tell you how much terrorism was going on,” LaFree says. Nobody was collecting and sharing data systematically, although many groups were working on isolated projects. LaFree, who trained as a criminologist, says that just before the 9/11 attacks, he “stumbled onto” a collection of global records describing 60,000 terrorist events assembled by the Pinkerton Global Intelligence Services—on note cards. They would become the core of the Global Terrorism Database, a free digital resource housed at START that contains facts and narratives about more than 98,000 events and is open to the public. Regularly updated, it has become a key reference.
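To give a flavor of how researchers query such an incident database, here is a minimal sketch in Python. The field names and records are invented stand-ins, not the GTD's actual schema or contents:

```python
from collections import Counter

# A toy, hand-made stand-in for event records; fields are illustrative.
events = [
    {"year": 1998, "country": "Kenya", "killed": 224},
    {"year": 2001, "country": "United States", "killed": 2996},
    {"year": 2004, "country": "Spain", "killed": 191},
    {"year": 2004, "country": "Russia", "killed": 334},
]

# Count incidents per year and find the deadliest single event.
per_year = Counter(e["year"] for e in events)
deadliest = max(events, key=lambda e: e["killed"])

print(per_year[2004], deadliest["country"])
```

The real database supports exactly this kind of aggregation — events per year, per country, per attack type — just over two orders of magnitude more records.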

    Talking terror.

    Abu Bakar Bashir in Jakarta, later sentenced to prison for plotting to kill Indonesia's president.


    START was created in 2005 by the Department of Homeland Security (DHS). As one of 12 DHS centers of research excellence, it manages a network of grantees and trains scholars. The Department of Defense and the National Science Foundation together also back an effort known as MINERVA (Science, 30 January 2009, p. 576), funding open academic research on terrorism. Many universities have centers of expertise in terrorism.

    Scholars have generated hundreds, perhaps thousands, of terrorism studies in the last decade, LaFree says. Researchers have looked for common patterns in the “trajectories” of violent organizations and their leaders in different regions. They have minutely dissected polling data on attitudes to terrorism, examined the link between economic conditions and violence, and analyzed some major government counterterrorist strategies. Some researchers have conducted interviews with terrorists.

    Just after 9/11, public attention focused on the links between Islam, radical views, and terrorist killings. Scholars were quick to point out, however, that terrorism has a long and diverse history. Experts such as historian Bruce Hoffman of Georgetown University in Washington, D.C., trace terrorism back 2000 years, defining it as violence or a threat of violence by a nonstate actor with a public or political objective. Jewish fighters known as the Zealots used assassinations as a weapon against the Roman Empire in the 1st century C.E. The anarchist movement of the 19th century killed several European heads of state. In the 20th and 21st centuries, the governments of Britain and Spain, among others, have been battling bomb-wielding nationalists.

    Terrorism's Long Trail


    A timeline of terror, from 70 C.E. to 2012.

    It's hard to get evidence on what motivates this violence, but some researchers have tried to do so by asking terrorists. Anthropologist Scott Atran—a prominent investigator at the University of Michigan, Ann Arbor, and the National Center for Scientific Research in Paris—has met with failed, captured, or inactive terrorists in such places as Indonesia, Gaza, and Morocco. The answers they give can seem strangely bland, Atran reports. Many young men told him they joined a violent group because a soccer-playing friend invited them to do so. Recruitment by action-oriented pals is key, Atran says. In testimony to Congress, in scholarly articles (see p. 855), and in a 2010 book, Talking to the Enemy, Atran argues that terrorists today are frequently “infected” with the “bug” of radicalism from peers. And that bug is a worldview built on “sacred values” that are not debatable but are linked to violent action. “It is critical to understand how ‘sacred values,’ such as devotion to God or country or dignity and honor, can motivate” the devoted to act violently, he writes.

    Atran doesn't consider fundamentalist religions the source of terrorism; he sees them as repositories of mostly benign values. Although terrorists have sacred values that may be linked to religion, he says, their creeds are action-oriented, alluringly risky, and violent. To counter them, Atran argues, governments need to disrupt the social networks that transmit hostile values and “deradicalize” recruits. Although provocative, such insights rely heavily on interviews and interpretation and are hard to test.

    A more down-to-earth idea has been proposed by political scientist Robert Pape of the University of Chicago in Illinois. Zeroing in on suicide terrorism, Pape and colleagues built a database of all the suicide attacks they could find from 1980 to 2001. Looking for commonalities, Pape found that such attacks were growing more frequent and more deadly over time—and that the overwhelming majority (178 out of 188) were linked to one political goal: forcing a democratic power to withdraw from territory it occupied. The evidence, he argued, belied the long-held notion that suicide attackers were mentally unbalanced or inspired by religious dogma. He noted that the most deadly post-1980 terrorists were the Liberation Tigers of Tamil Eelam—a Marxist, atheist group seeking control of Sri Lanka. (The government defeated them in 2009.)

    Lethal dreams.

    A bomb ignites in 2009 during a Muslim ceremony in Sri Lanka (left), set off by a member of the Liberation Tigers of Tamil Eelam (right). This Marxist outfit, now defunct, took credit for more suicide bombings than any other group.


    Pape and his University of Chicago colleague James Feldman returned to this theme with a larger database that now includes more than 2200 suicide attacks logged between 1980 and 2009; they have made their data publicly available.

    Praised by one reviewer as “rigorously empirical,” their analysis offers what the authors call “strong confirmation for the hypothesis that military occupation is the main factor driving suicide terrorism.” Occupations “account for” 87% of the more than 1800 such attacks examined since 2004, they wrote. They identified a message applicable to any democracy: Don't be an occupier where you're not welcome.

    A group of economists found this analysis weak, however, blasting it in a 2008 article in the American Political Science Review. Scott Ashworth, Joshua Clinton, Adam Meirowitz, and Kristopher Ramsay of Princeton University argued that Pape had “selected on the dependent variable”—examining only examples of the effect he was trying to study (suicide terrorism) rather than examining all occupations and trying to discern the response. It was a “classic mistake,” says Ashworth, who's now at Chicago. Ashworth also sees a contradiction in recent events: There have been suicide attacks in Pakistan, where there is no foreign occupation.

    Mapping terrorism.

    Martha Crenshaw's Web site at Stanford tracks the lineage of violent groups.


    Pape responds that, in a “slight modification” of his theory, he has categorized Pakistan as being “indirectly occupied” by the U.S. military. And this, he says, can also lead to suicide terrorism. Pape rejects the criticism of his methods, saying that in books he published in 2005 and 2010, he did exactly what the critics wanted: “I collected not a sample, but the entire universe” of military occupations and suicide attacks. He predicts that suicide attacks will “fade” in the next 5 years as U.S. troops leave Afghanistan.

    Economy of terror

    Economists have been delving into terrorism, too. They're probing questions such as whether the poor are more likely to carry out violent attacks (the consensus is that they are not), the internal dynamics of radical groups, and the effect of terrorist attacks on public opinion. Some of their findings are unsettling. For example, a 2010 paper by economists Esteban Klor and Eric Gould of the Hebrew University of Jerusalem compared polling data from Tel Aviv and Haifa before and after terrorist attacks that took place at different times. The researchers found consistent results: In both locations, more people favored compromise with the Palestinians after attacks than before.

    The best studies, says Ethan Bueno de Mesquita, a political scientist at the University of Chicago who recently surveyed the field, analyze “fine-grained” information about specific people and events. He says “a small group of scholars” is doing most of this work, beginning with a well-defined theory and gathering “microlevel” data, such as details of attitude changes before and after an event in a specific location. Focusing in this way makes it easier to search for cause and effect, de Mesquita says, but there's a tradeoff: Conclusions tend to be fine-grained, too.

    Unsteady rise.

    A change of methods partly explains the dip in records at the Global Terrorism Database at the University of Maryland.


    Berman, research director at UCSD's Institute on Global Conflict and Cooperation, is an economist who has also tried to develop a general hypothesis and gather micro-level data. His starting point is a bit “jarring,” he concedes: Terrorists see themselves as altruists and are “fairly rational actors.” The assumption is based on research by Israeli psychologist Ariel Merari and others, who have “gone into cells” to interview suicide attackers who didn't succeed or were caught, Berman says. He used interview data and records of attacks in Israel and Israeli-occupied territories to zero in on the deadliest groups. He argues that they rely on the same loyalty-boosting methods used by religious sects and mutual aid organizations.

    Berman's analysis is an outgrowth of work by economist Laurence Iannaccone on the dynamics of religious sects. A salient aspect of all sectlike mutual aid groups, according to Iannaccone and Berman, is that they raise barriers against newcomers and require big sacrifices from members. The advantage for aid groups is that they can exclude free riders; terrorist groups gain by weeding out potential defectors. The most restrictive sects—those that demand hours of devotion, impose dress codes, and rigidly control personal interactions—bind members in a kind of family that makes the group more “effective,” according to Berman. In the economy of terror, the more effective groups kill more people.

    Berman tested this idea several ways: for example, by analyzing all suicide attacks in Israel, Israeli-occupied territories, and Lebanon between 1981 and 2003, as he describes in his 2009 book Radical, Religious, and Violent. Using the number of fatalities per attack as a “lethality” index, Berman found that faith-based providers of social services were the most dangerous. Two out of seven groups stood out: Hamas and Hezbollah, topping the list with 63 and 44 fatalities per attack. The Palestinian Islamic Jihad ranked third, at 37 fatalities per at tack, followed by others ranging from 31 to 1. Berman says data from Iraq follow the same pattern, linking high lethality and religious radicalism.

    Dead certainties.

    Anthropologist Scott Atran studies “sacred values” held by terrorists.


    de Mesquita says Berman's studies are “the most interesting work about terrorism and religion,” adding that they're not really about religion, but about methods to boost group commitment and avoid infiltration. Many questions remain unanswered, however. For example, why are some sectlike groups benign while others are terribly violent? Berman concedes that he doesn't have an answer.

    There are many gaps in terrorism research, says social psychologist and expert on radicalization, Clark McCauley of Bryn Mawr College in Pennsylvania. He is concerned, for example, that by rejecting the idea that pathology lies at the heart of terrorism, “maybe we went too far” in looking for explanations in normal group behavior. McCauley recently studied 200 criminal cases linked to al-Qaida and discovered that 40 involved “lone wolf ” terrorists. A recent example is Mohamed Merah, who was killed in a shootout with police in March after gunning down citizens and paratroopers near Toulouse, France. Another suspected of acting alone is U.S. Army Major Nidal Malik Hasan, accused of an attack at Fort Hood, Texas, in November 2009 in which nine people were killed and 30 injured. This suggests that lone actors are a significant subcategory deserving a second look, he says. After preliminary research, he has a hunch that these people may not be distinguished by an emotional deficit but by a surplus of empathetic feelings for perceived victims—and a willingness to act on their behalf.

    McCauley is keenly aware that there are few certainties in the study of terrorism—so much so that he considers himself “a connoisseur of holes.” The biggest hole he sees is the failure to keep good data on governmental attempts to counter terrorism and their consequences. “We don't know when we instituted a new initiative and we don't know when we changed it, or stopped doing it,” he says. He says the demand for objective data is growing.

    Like McCauley and other terrorism researchers, Crenshaw acknowledges that the gaps can be frustrating. Even some widely reported events—such as the 1982 blast in Tyre—remain murky, their cause disputed. But Crenshaw says the field now has excellent public databases and a growing community of investigators. We know a lot more about terrorism than we did a decade ago.

  7. Terrorism's Long Trail



  8. In Battle

    Tribal Roots in South Sudan

    1. Eliot Marshall

    Conflict has long been part of the cattle economy of Jonglei State in South Sudan. But the impact of fighting has intensified since the 1990s, according to many Sudan analysts.

    Murder was on the agenda for New Year's day in Pibor, a remote town in Jonglei State in South Sudan. Isolated by bad roads and poor communication, Pibor is home to a people known as the Murle, feared by their neighbors and also often attacked by them. In late 2011 and early 2012, a spokesperson for a group of the neighboring Lou Nuer tribe made the ultimate threat. The New York Times and other papers quoted him as saying that they intended to “wipe out the Murle tribe on the face of the earth.”

    A vigilante army of more than 6000 young Nuer men mobilized in late December, armed with automatic rifles and knives, calling themselves the White Army. They accused the Murle of constantly stealing cattle and kidnapping women and children—a charge frequently made across Jonglei State, homeland of the Murle, Nuer, Dinka, and other tribes. The incitement to attack, the White Army said, was a Murle raid against the Nuer in August 2011 in which 600 people were killed, women and children were kidnapped, and tens of thousands of cattle were stolen.


    At the end of December, the vice president of South Sudan's national government, Riek Machar, met the warbound Nuer to plead for peace. He failed. The government stationed 400 troops near Pibor but did not intervene. United Nations peacekeepers, with a force of 400 in the area, also kept their distance. As the White Army marched on Pibor, burning huts along the way, U.N. officials advised people to flee.

    The town emptied, and more than 50,000 people hid in the bush without food, fresh water, or shelter for a week. When the Nuer began to withdraw on 3 January, the Murle side said that 3000 people had died or been killed and hundreds of thousands of cows taken. A U.N. official said the number of deaths was “nowhere near” that high, although still unknown.

    Like January's death toll, the start of this violence is hard to pin down. Ask Gai Bol Thong, a Nuer partisan in Seattle, Washington, and an advocate of the White Army: He says the attacks and revenge strikes between the Murle, the Nuer, and others in Jonglei “began before the liberation of Sudan” from colonial rule in 1956. Such raids may have been going on for centuries.

    Conflict has long been part of the all-embracing cattle economy of Jonglei. The animals are food (there's little agriculture), the embodiment of wealth, and a prerequisite for fatherhood. To start a family, a man must have cattle to buy a wife. The cattle are herded by armies of young men, who clash with others as they move cows to fresh feeding areas, often through neighbors' lands. Cattle herders may also be cattle takers, and rustling is a time-honored way of securing a dowry. Children are part of this economy, too: Boys may be abducted to work as herders by a competing tribe, and girls may be taken to be exchanged for cattle. The Murle are not the only ones engaged in this trade.

    These patterns may be centuries old, but the impact of fighting has intensified since the 1990s, according to many Sudan analysts. Factions in a long civil war between northern and southern Sudan distributed automatic weapons far and wide; these are now in the hands of uncontrolled militia. The government of South Sudan, just 10 months old, hasn't been able to confront the anarchy at home, in part because it is still fighting on the border with Sudan. (Ethnic battles appear at least as severe within Sudan, which is governed by the world's only sitting head of state charged with genocide by the International Criminal Court, for attempting to eradicate nomadic tribes in Darfur.)

    For now, South Sudan's leaders have chosen a simple remedy to combat violence. In April, they launched a campaign to confiscate weapons from civilians. Hundreds have been collected already, but experts say this is not a viable long-term strategy. John Young, a Canadian political scientist who tracks the conflict in Jonglei for the Carter Center in Atlanta, notes that it has been tried before. People can now get guns when they want them. Disarmament, he wrote in 2010, is “unlikely to prove enduring” because it does not address the underlying causes of conflict “nor the deep poverty of the people which has made abductions a lucrative business.”

    The solution, as many investigators have concluded, is to provide real public security, improve roads and communication, stimulate the economy, and ensure fair and honest governance.

  9. Roots of Racism

    1. Elizabeth Culotta

    Humans everywhere divide the world into “us” and “them.” Why are we so tribal?

    You're alone in a dark alley late at night. Suddenly a man emerges from a doorway. If you are a typical white American and he is a young black man, within a few tenths of a second you will feel a frisson of fear as your brain automatically categorizes him. Your heart beats faster and your body tenses.

    In this event, nothing happens. He glances at you and moves away. You walk on, feeling foolish for fears based merely on his membership in a racial group.

    Tension and suspicion between groups—whether based on racial, ethnic, religious, or some other difference—fuel much of the world's violence. From the enduring feuds of the Middle East and Northern Ireland, to the vicious raids of South Sudan, to the gang warfare that plagues American cities, even to bullying in schools and skirmishes between fans of rival sports teams, much of the conflict we see today erupts because “we” are pitted against “them.”

    Some of the prejudice behind these conflicts is not conscious: Your fear spiked in that dark alley before your conscious brain had even registered the young man's skin color. This prejudice apparently stems from deep evolutionary roots and a universal tendency to form coalitions and favor our own side. And yet what makes a “group” is mercurial: In experiments, people easily form coalitions based on meaningless traits such as preferring one painter over another—and then favor others in their “group,” giving them more money in games, for example. “In arbitrarily constructed, meaningless groups with no history, people still think that those in their ingroup are smarter, better, more moral, and more just than members of outgroups,” says Harvard University psychologist James Sidanius.

    A wide and deep literature has explored these innate biases in the 40 or so years since they were first discussed. Now researchers have begun to ask why humans are apparently primed to see the world as ingroups and outgroups. What factors in our evolutionary past have shaped our coalitionary present—and what, if anything, can we do about it now?

    Several avenues of research are probing the origins of what many psychologists call ingroup love and outgroup hate. Researchers are testing the implicit biases of young children and even primates, and devising experiments to ratchet bias up and down. Evolutionary researchers are trying to parse the group environments of our ancestors and are debating just how big a selective pressure came from outgroup male warriors. “The origin of all this is the all-consuming question of the past few years,” says Harvard psychologist Mahzarin Banaji.

    Group love

    Fear of “them.”

    Members of outgroups can spark automatic, implicit prejudice.


    For many researchers, our cruelty to “them” starts with our kindness to “us.” Humans are the only animal that cooperates so extensively with nonkin, and researchers say that, like big brains, group life is a quintessential human adaptation. (In fact, many think big brains evolved in part to cope with group living.) Studies of living hunter-gatherers, who may represent the lifestyle of our ancestors, support this idea. Hunter-gatherers “cooperate massively in the flow of every imaginable good and service you can think of,” says anthropologist Kim Hill of Arizona State University (ASU), Tempe, who has studied hunter-gatherers for 35 years. “Anything you need in daily life, the person next to you will lend you: water, sticks for firewood, a bow and arrow, a carrying basket—anything.”

    Thus the group buffers the individual against the environment. “Our central adaptation is to group living,” says psychologist Marilynn Brewer of the University of New South Wales in Sydney, Australia. “The group is primary.”

    When the ingroup is loved, by definition there must be a less privileged outgroup. “One can be expected to be treated more nicely by ingroup members than by outgroups,” as Brewer put it in a seminal 1999 paper. “It is in a sense universally true that ‘we’ are more peaceful, trustworthy, friendly, and honest than ‘they,’” she wrote.

    If groups compete for territory or resources, favoring the ingroup necessarily means beating the outgroup and can escalate into hostility, Brewer notes. Several other researchers (see p. 876) have recently argued the reverse: that over time, hostilities between groups fostered ingroup love, because more cooperative groups won battles. Whichever came first, researchers agree that outgroup hate and ingroup love may have spurred each other.

    A warrior past?

    Battles among hunter-gatherers such as these Papua New Guineans suggest to some that ancient wars shape modern fears.


    In the United States and some other countries, the sharpest division between groups is often racial. But researchers agree that it's not that white people have evolved to be suspicious of black people per se or vice versa. Such an evolved prejudice could only arise as the result of frequent negative interactions between races in the past, explains anthropologist Robert Boyd of the University of California, Los Angeles. But thousands of years ago, people didn't cross continents to meet each other. “In the distant past, we had very little experience interacting with people who were physically very different from us,” Boyd says. “That's only since 1492. Ethnic distinctions, however, are presumably quite old.” Thus racial prejudice is a subset of a much broader phenomenon. “This is not just about racism,” says psychologist Susan Fiske of Princeton University.

    The targets of outgroup prejudice vary from culture to culture and over time—Sidanius refers to them as “arbitrary set” prejudices. In Sri Lanka, it may be Tamils; in Northern Ireland, Catholics or Protestants; in India, the Untouchables. Fiske notes that the world over, the greatest prejudice is often aimed at people without an address, such as gypsies and the homeless. Whoever the target, we have a psychological system that prepares us to “learn quickly, in whatever cultural context we're in, what are the cues that discriminate between us and them,” says psychologist Mark Schaller of the University of British Columbia, Vancouver, in Canada.

    This doesn't mean that prejudicial behavior is inevitable, Schaller says. “These prejudices tap into very ancient parts of our minds, and it's happening at a very quick, automatic level,” he says. “But we have recently evolved parts of our brains that allow us to engage in slower, more rational thought. When I experience that fear in a dark alley, it may take me another half-second for a more rational thought to kick in, but I'll get there, if I have the motivation and means to do so.”

    Shoot or don't shoot

    Religion, not race.

    Groups divide along many axes, including religion, as when protesters and police clash in Northern Ireland.


    Psychologists have become master manipulators of prejudice in the lab, with clever experiments that reveal underlying biases. In the Implicit Associations Test, for example, people are asked to rapidly categorize objects and faces; the pattern of mistakes and speed shows that people more quickly associate negative words such as “hatred” with outgroup faces than ingroup faces. “It takes significantly longer to associate your ingroup with bad things and the outgroup with good things,” Sidanius says. In disturbing tests using a video game, people looking at a picture of a person carrying an ambiguous object are more likely to mistake a cell phone for a gun and shoot the carrier if he is an outgroup male.
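The reaction-time logic behind the Implicit Associations Test can be sketched with a toy computation. The trial latencies below are invented for illustration, and the real test uses a more elaborate scoring algorithm (the "D score," which scales the latency difference by trial variability); this is only the core comparison Sidanius describes.

```python
# Toy illustration of the Implicit Associations Test's reaction-time logic.
# Latencies (ms) are invented; the real IAT's D score also divides the mean
# difference by the pooled standard deviation of all trials.

# Block A: ingroup+good / outgroup+bad share a response key ("congruent")
congruent_ms = [612, 587, 640, 598, 625]
# Block B: ingroup+bad / outgroup+good share a key ("incongruent")
incongruent_ms = [733, 701, 768, 722, 745]

mean_congruent = sum(congruent_ms) / len(congruent_ms)
mean_incongruent = sum(incongruent_ms) / len(incongruent_ms)

# A positive gap means it took longer to pair the ingroup with "bad" and
# the outgroup with "good" -- the pattern the test is designed to detect.
latency_gap_ms = mean_incongruent - mean_congruent
print(f"mean congruent:   {mean_congruent:.0f} ms")
print(f"mean incongruent: {mean_incongruent:.0f} ms")
print(f"latency gap:      {latency_gap_ms:.0f} ms")
```

With these made-up numbers the incongruent block runs more than 100 milliseconds slower, the kind of gap the pattern of speeds reveals.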

    This type of bias shows up in all cultures studied and in children; it also appears in people who say they are not prejudiced and who work consciously for equality. “This is in every single one of us, including me,” Banaji says.

    It starts young. In work under review, Yarrow Dunham of Princeton, Banaji, and colleagues found that Taiwanese toddlers assumed that a smiling racially ambiguous face was Taiwanese, but a frowning one was white; white, American 3-year-olds similarly preferred their ingroup.

    Whose group?

    Taiwanese toddlers thought racially ambiguous faces like these were Taiwanese when smiling and white when looking angry.


    Even our primate cousins categorize others into ingroups and outgroups. Chimps obviously have outgroup bias: They sometimes band together and attack and kill members of other troops. Last year, a study managed to uncover primates' implicit expectations of “us” and “them” for the first time.

    Laurie Santos of Yale University, working with Banaji and others, adapted a psychological test for rhesus macaques, group-living monkeys whose lineage diverged from ours about 25 million to 30 million years ago. The researchers predicted that the macaques would stare longer at the faces of outgroup members, who might be more dangerous, or at mismatched photo pairs, such as things they liked shown next to outgroup faces they didn't like. In a series of experiments published in the Journal of Personality and Social Psychology, Santos and colleagues found that macaques looked longer at photos of outgroup members. They also looked longer at photos of outgroup members next to pictures of fruit, and at photos of ingroup members with spiders and snakes.

    Seeing such apparent bias in primates suggests it is evolutionarily ancient. This “coherence of results across species and ages is satisfying,” Banaji says, and tells us that outgroup bias is “core to our species.”

    While these researchers probe bias in different populations, others are exploring how to manipulate it. Our attitudes toward outgroups are part of a threat-detection system that allows us to rapidly determine friend from foe, says psychologist Steven Neuberg of Arizona State University, Tempe. The problem, he says, is that like smoke detectors, the system is designed to give many false alarms rather than miss a true threat. So outgroup faces alarm us even when there is no danger.

    On the lookout.

    Macaques stared longer at photos of the faces of outgroup members than at ingroup faces.


    Neuberg and Schaller have studied what might turn this detection system up and down. When you feel threatened, you react to danger more quickly and intensely; people startle more easily in the dark. That's why prejudice rears its head in a dark alley rather than a well-lit field. In a variety of studies, Neuberg, Schaller, and their colleagues have manipulated people into feeling unconsciously more fearful or confident and found that measures of outgroup bias respond. Canadians taking tests in the dark rated Iraqis as less trustworthy and more hostile than other Canadians. Sinhalese in Sri Lanka stereotyped Tamils as more hostile after being primed with a geographic context that made them feel outnumbered. And white undergraduates were more likely to misperceive anger on the faces of black men—but not whites—after watching a scary scene from the movie The Silence of the Lambs. Schaller adds that some people seem to go through life more cognizant of threats than others, and that prejudice is more easily intensified in these people.

    These findings can inform real-life tragedy, researchers say. On the evening of 26 February, a Hispanic man named George Zimmerman shot an unarmed black teenager, Trayvon Martin, in a gated community in Sanford, Florida. Zimmerman told police that he followed Martin suspecting criminal activity, was attacked, and fired in self-defense. Researchers cannot speak to what happened that night. But when Zimmerman first spotted Martin, the situation was a “perfect storm” for triggering feelings of vulnerability and implicit prejudice, Neuberg and Schaller say.

    It was dark and raining. Martin, 17, though slender, was tall. Zimmerman, 28, was quite alert to crime in his neighborhood; he had started the neighborhood watch. Martin was young, male, and black, an outgroup stereotyped as dangerous by whites and Hispanics in the United States. “We would predict that under those circumstances this kind of thing would happen more often,” Neuberg says.

    If certain situations turn implicit prejudice up, can it be turned down? Schaller notes that making people feel safer can moderate this bias, whether through specific priming or more generally with lower crime rates or a better economy. To unconsciously prime her own mind, Banaji has created a screen saver that displays stereotype-smashing images. Other researchers say that deliberately engaging the slower conscious mind may help. For example, in addition to skimming all job applications quickly, a manager might read the files of minority applicants with care.

    Men in the crosshairs

    Martin's case is a good example of how outgroup prejudice falls hardest on men, Sidanius says. He argues that this has specific evolutionary roots: Because it is men who typically make war, it was outgroup men who attacked our ancestors; it was also men who were more likely to be killed in combat. “Back in the Pleistocene, outgroup males really were dangerous,” he says.

    If natural selection has shaped our minds to be wary of outgroup males, then they should face more prejudice than outgroup women, says Sidanius, an African American who himself was the target of hate crimes as a young man. He and colleagues have assembled a devastating catalog showing how this is true for black men in America. As compared with black women, black men are more likely to be victims of hate crimes, receive harsher jail sentences for comparable offenses, pay more money for cars—the list goes on and on. Data suggest that West Indian and South Asian men in the United Kingdom face similarly disproportionate bias, Sidanius says.

    Building on these ideas, in March, Melissa McDonald of Michigan State University in East Lansing and colleagues proposed what they called the “warrior male hypothesis,” arguing that natural selection has shaped men's minds, more than women's, toward belonging to coalitions. They predict that men are more prejudiced than women, and some data show this.

    But others aren't so sure that intergroup war was a prominent feature of our prehistory (see p. 829). Foragers depend on far-flung networks to gain access to the social and natural resources of others, notes anthropologist Polly Wiessner of the University of Utah in Salt Lake City. The !Kung of Africa, whom Wiessner studied for decades, “may travel for hundreds of miles to visit exchange partners in less familiar areas, with no fear of unknown males,” she says.

    Whether or not humans have evolved to fear outgroup men per se, researchers agree that we are prone to categorize and sometimes fear outgroups. “What we're arguing is a natural preference for drawing ingroup-outgroup boundaries. It can be race, religion, nationality, dialect, or arbitrary set differences,” Sidanius says. “But once those boundaries are drawn, people like to discriminate across them.”

  10. In Battle

    Preening the Troops

    1. Elizabeth Pennisi

    Green wood hoopoes boost in-group cooperative behavior by preening helpers more before and after rallies, showing that it doesn't require a primate brain to reinforce social bonds in the face of an outside threat.

    “Kek-ek-ek-ek! Kek-ek-ek-ek!” That's a call to arms for the green wood hoopoe, a 44-centimeter-long South African bird that forms alliances among a breeding pair and up to 10 helpers. As soon as one bird starts calling, others join in, in a display of solidarity that can be heard a kilometer away.

    Listen closely and not one but two groups typically sound off, taking turns to see which one calls loudest and longest. The birds depend on these “rallies” to settle disputes, which today represent one of the best documented examples of group conflict management in animals other than primates, says behavioral ecologist Andrew Radford of the University of Bristol in the United Kingdom. “Nonprimates such as birds … offer many good opportunities to generate hypotheses about intergroup behavior,” says anthropologist Richard Wrangham of Harvard University.

    Just as humans shower hometown athletes with praise at pep rallies and postgame celebrations, wood hoopoes boost in-group cooperative behavior by preening helpers more before and after rallies. This avian behavior shows that it doesn't require a primate brain to reinforce social bonds in the face of an outside threat.

    In wooded ravines bordering rivers, the birds set up permanent territories that stretch about 1.5 kilometers long but are typically just several hundred meters wide. Groups of two to 12 birds face off whenever one crosses or nears a territory boundary. Losers retreat, allowing winners to barge in to feed on insects and other invertebrates in the treetops, and to nose around in roosting holes before returning to their original territory.

    When groups face off in a display like the wood hoopoe rallies, the number of individuals can matter more than the size or stamina of any one member (see p. 838). Roaring in lions, for example, indicates a pride's size and its fighting ability.

    Radford created artificial contests by playing prerecorded rallies to wood hoopoes. During a rally, the birds in a group sit close together and rock back and forth, calling for about 10 seconds per bout. The calls are coordinated, with each bird cackling in turn “like a well-oiled chorus,” Radford says. Rival groups take turns sounding off.

    Radford found that a group cackled longer if the challenging band was larger. Longer responses even by small groups may be an attempt to disguise their small number, he says. Some contests were short, less than 5 minutes, and usually won by resident birds no matter their group size. But other battles stretched more than 15 minutes, with calling bouts getting longer as residents matched intruders' efforts. Larger groups usually won such extended contests.

    Researchers have long suggested that conflict between groups strengthens the cooperation and solidarity within them; some say this has shaped humans' tendency to form coalitions (see p. 825). Experiments have borne out this idea, such as those pitting groups of college students playing a cooperation game against each other. Radford investigated whether the same held true in the green wood hoopoes, looking at friendly gestures among birds after rallies. “Looking at the aftermath, virtually no one had done that,” he says. “But it has clear parallels with human battles.”

    Three cheers.

    Green wood hoopoes foster group cohesion as they prepare to match calls against other groups.


    To measure affiliative behavior, he noted incidents of allopreening in which one bird picks at the feathers of another, often to remove parasites. He watched 12 groups of birds, charting rallies as well as who groomed whom and when. Short contests didn't lead to a change in preening, but long conflicts, particularly when lost, spurred more preening of group members, particularly by the dominant pair, he reported in 2008. He thinks the post-conflict preening helps reduce social stress caused by the conflict. The dominants may use grooming not only to soothe the helpers, who do the lion's share of cackling during contests, but also to “enhance the helper's participation in the future,” similar to how primates groom one another to strengthen bonds, he says.

    “This important finding indicates that wood hoopoes have evolved a partial solution to a collective action problem” of some group members working harder than others, Wrangham says. “It's the kind that is worth looking for in primates.”

    But Radford also noticed that body preening increases when the birds simply enter a part of their territory where past conflicts occurred. When the birds approach their borders, “there's a preemptive change in behavior,” he notes. The birds weren't just more anxious, because self-preening—considered a sign of increased stress—did not increase, he reported online on 7 July 2010 in Biology Letters. This finding parallels what is seen in soldiers, who bond more when they enter combat zones than when stationed back from the front lines. “If we are finding these links in birds and not just among humans, it suggests these links could be very important over evolutionary time,” Radford says.

  11. The Battle Over Violence

    1. Andrew Lawler

    Under the long shadow of Rousseau and Hobbes, scientists debate whether civilization spurred or inhibited warfare—and whether we have the data to know.

    It takes a village.

    Violence appears in every culture but becomes more complex as societies develop.


    With its World Wars, genocides, and innumerable revolutions and civil wars, the 20th century was the bloodiest in human history. World War II alone left some 60 million dead—2.5% of the world's population, or the total number of people who lived in Europe during the Middle Ages. Yet a group of researchers argues that complex industrialized societies, even Nazi Germany or Stalin's Russia, are far safer places to live than among smaller groups of hunter-gatherers or farmers, where tribal feuds and homicide typically felled more than 10% of the population.

    “This is the paradox,” says Azar Gat, a military historian at Israel's Tel Aviv University and author of a major study on the subject. “Mortality was higher before the state appeared.” Perhaps the most prominent advocate of this view is Harvard University neuroscientist Steven Pinker, who argued in his 2011 book The Better Angels of Our Nature that complex societies have spurred a clear and measurable decline in the rate of violence compared to the bad old days when humans lived primarily in more intimate, small groups. With the centralized state, he writes, “came a more or less fivefold decrease in rates of violent death.”

    Many archaeologists and anthropologists agree that the odds of dying violently are lower in a modern nation today than in medieval Europe, a comparison prominent in Pinker's book and one that draws on the relatively plentiful records of the past 5 centuries. But a number of researchers dispute the more general assertion that violence decreases as a society becomes more complex. This theory, they maintain, ignores yawning gaps in data as well as the enormous diversity of rates and forms that violence takes from one society to the next.

    “The variability across cultures in how violence is used is staggering,” says Debra Martin of the University of Nevada, Las Vegas, who specializes in ancient health and violence. Adds Henry Wright, an archaeologist at the University of Michigan, Ann Arbor, who has studied violent death in the ancient Near East, China, and Africa: “I don't have the statistics, and I don't think anyone else does either.”

    Beneath the argument over numbers is a controversy reaching back several centuries. Swiss-born French philosopher Jean-Jacques Rousseau argued in the 18th century that complex society brought about greater inequality, oppression, and fear. Thomas Hobbes, a 17th century English philosopher, argued that life without the social order and a strict hierarchy was “solitary, poor, nasty, brutish and short.” Whether civilization corrupted humans or saved them has sparked debate among scholars ever since.

    Early and mid-20th century studies of ancient people seemed to confirm a more Rousseauian view in which scattered populations, minimal technology, and ample game limited human violent conflict in the distant past. Cave paintings in Europe from about 40,000 to 10,000 years ago portray hunting of animals but not human-on-human conflict. Archaeologists found little evidence of murder and organized violence before the military empires of the Near East sprang up 4000 years ago. Studies of living hunter-gatherer tribes in the first half of the 20th century appeared to show low rates of violence: American anthropologist Margaret Mead concluded in 1935 that in the Arapesh tribe of New Guinea, “both men and women are naturally maternal, gentle, responsive, and unaggressive.” And initial primate research found fewer violent tendencies in humanity's nearest cousins.

    This Rousseauian perspective began to lose favor a half-century ago. Early Neolithic cave paintings in Spain recorded in the 1980s show humans shooting arrows at one another. Primatologists discovered that warfare and murder are not unusual among chimpanzees. And more intensive anthropological work began to shed light on a more violent side of small-group society.

    In 1996, anthropologist Lawrence Keeley of the University of Illinois, Chicago, published War Before Civilization: The Myth of the Peaceful Savage, based on a wide range of data from prehistoric sites, modern hunter-gatherers, and other groups living outside established states. He concluded that more than 90% of human groups engage in war, including small-scale groups. For those people living outside states, Keeley estimated that the average annual rate of death in warfare was 524 per 100,000 people—twice that of the famously warlike Mesoamerican Aztecs in the 16th century. By contrast, even during the bloodiest years of World War II, Russia and Germany had violent death rates of about 140 per 100,000 citizens. He concluded that living in a small-group society is significantly more dangerous than being a member of a more complex one.

    Pinker uses Keeley's data and unpublished studies by economists to argue that complex society brought standing armies, laws, walled cities, and other innovations that restricted tribal fighting and protected the average citizen from violent crimes. “Hobbes understood this without having the data,” Gat adds.

    Pinker blames what he calls “anthropologists of peace” for distorting the record on small-scale group violence. “The classic ‘gentle people’”—the Semang of the Malay peninsula, !Kung in Africa, and Central Arctic Inuit—“turned out to have higher homicide rates than those of American cities,” Pinker says. He criticizes what he calls a single-minded determination “to make hunter-gatherers seem as peaceful as possible.”

    The great debate.

    Debra Martin and Gwen Robbins Schug argue that violence varies too widely for generalizations, while Azar Gat and Steven Pinker say civilization brought peace (clockwise from upper right).


    Such charges puzzle some biological anthropologists and archaeologists—the kinds of scholars who gather the type of data used in this debate. They do not argue for a Rousseauian perspective. But that doesn't mean they're ready to embrace a Hobbesian view, either. They find the data too weak to support such sweeping claims and add that the statistical averaging done by Pinker and Gat erases the enormous variation in small-scale societies. Pinker “misused the bioarchaeological record by selecting a few populations … biased toward supporting his argument,” complains archaeologist Gwen Robbins Schug of Appalachian State University in Boone, North Carolina.

    In a 2011 review of existing violence data on both nonstate and state societies, criminologist Amy Nivette of the University of Cambridge in the United Kingdom found persistent problems with data on small groups, which typically do not record violent deaths. Different researchers, for example, assign different rates of violence to the !Kung. And tiny numbers in small groups make statistics unreliable. Keeley cites the Polar Eskimo, for example, but given their small population, a single Eskimo murder every 50 years would equal the current rate for the United States. To sort out these problems, “researchers must carefully and more systematically consider what is meant by ‘low’ levels of violence,” Nivette says. “Violence in nonstate societies is more complex and changeable than the stylized Hobbes-Rousseau dichotomy,” she concludes.
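The small-numbers problem Nivette raises is simple arithmetic, and it can be made concrete with a short sketch. The population figure used here (about 200 people) is an illustrative assumption, not a number from the studies discussed:

```python
# Why tiny populations make homicide-rate comparisons unreliable.
# The population of ~200 is an illustrative assumption.

def rate_per_100k(deaths, population, years):
    """Annualized violent-death rate per 100,000 people."""
    return deaths / (population * years) * 100_000

# One murder in a 50-year window among ~200 people:
small_group = rate_per_100k(deaths=1, population=200, years=50)

# The same society observed in a 50-year window with no murders:
small_group_zero = rate_per_100k(deaths=0, population=200, years=50)

print(f"1 murder / 50 yr:  {small_group:.1f} per 100,000 per year")
print(f"0 murders / 50 yr: {small_group_zero:.1f} per 100,000 per year")
```

A single event swings the estimate from zero to double digits per 100,000, which is why one murder every 50 years in a group this size can match the rate of an entire modern nation.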

    Other researchers say they are more interested in the variability in violence than the averages. For example, Keeley's data do show dramatic differences in homicide rates among hunter-gatherers who tend to live in small bands and among agriculturalists in modest villages. More recent anthropological surveys by other researchers note that the Amba of Uganda and the Eastern Pueblo of New Mexico have fewer than two murders per 100,000 people, while the Hewa of New Guinea top 700 per 100,000, and the 19th century Kato of California reached 1450. By comparison, the U.S. national homicide rate peaked just below 10 per 100,000 during the crime-ridden late 1980s; in 2010, the cities of Baltimore and Detroit, which rank as the most violent, were nearly tied with just over 34 homicides per 100,000.

    Those living in small-group societies “are not always peaceful, and not always at one another's throat,” says George Milner, an anthropologist at Pennsylvania State University, State College, who has spent much of his professional life examining ancient skeletons with signs of physical injuries. “We are beginning to see a highly varied picture—times when conflict is quite severe, as with today's Yanomamö [foragers of northern South America], and also places where there are prolonged periods of peace.”

    He cites the example of the Hopewell culture of the 1st through 5th centuries C.E. in eastern North America, which appears to have been “socially permeable,” allowing traders to safely transport obsidian from sources in what is today Wyoming as far east as Ohio. Such ease of movement would have been unthinkable before and after that era, when violence between groups was more common. The interesting question, Milner says, is what changed. “To see this from a solely Hobbesian viewpoint misses the real story,” he adds. “We want to know why people switch from peace to war and back again.”

    Pinker acknowledges that there are exceptions to the rule of more violent small-scale societies, but he maintains that “the bulk of the distribution includes massive death rates by violence.” His point, he says, is that more organization leads to less chaos—and less violence. That extends to farming groups as well as hunter-gatherers, notes Gat, who cites the “staggeringly violent mortality rates” among agricultural groups in New Guinea.

    Pinker and his critics agree that questions of how violence shifted in form, as well as rate, are worthy of study. To do that, scientists say they must gather more data on the ways of violence among a host of societies, both large and small, across the globe and through many millennia. Milner argues for tracking the pulses of conflict that characterize all societies to understand the conditions that spark war and pave the way to peace. But that goal will no doubt require fractious researchers from many disciplines to themselves lay down their arms and work more cooperatively.

  12. In Battle

    Tweeting the London Riots

    1. John Bohannon

    People have rioted since the dawn of civilization. But online social media may be radically changing how riots form and spread.

    It started when four gunshots rang out on Ferry Lane in north London on 4 August last year. Television and radio reported the known facts of the case: Police officers had killed a 29-year-old man, possibly unarmed, in an impoverished neighborhood. Online, rumors and rage spread rapidly. Some accused the police of executing the man. Two days later, London convulsed with the worst riots in living memory. As traditional media outlets scrambled to keep up, people used online social media tools to share information and opinions. Soon the riots infected other U.K. cities. For five nights, mobs clashed with police, smashed windows, looted shops, and started fires across the United Kingdom.

    During the riots, Emma Tonkin watched and worried like everyone else. The streets had erupted in chaos in nearby Bristol, and she wondered whether her housemate should go to work the next day. “Why was this happening, the senseless violence?” she says. Tonkin, who is an information technology researcher at the University of Bath, studies online social media. According to press accounts, rioters were using media such as Twitter to incite and organize. That was a claim that she knew how to test. So she assembled a research team, harvested 600,000 tweets from the days of the riot, and got to work.

    People have rioted since the dawn of civilization. But online social media may be radically changing how riots form and spread. Until recently, it was difficult for a large crowd of strangers to coordinate, be it for peaceful protests or violent riots, says Katharina Zweig, a network scientist at the University of Heidelberg in Germany: “You needed to contact a lot of people, make sure that nobody else could overhear your message.” Now, with diffuse conversations of millions of people taking place on platforms like Facebook and Twitter, “you can direct messages to many other people at no cost.”

    Tonkin and her colleagues are part of a growing group of social scientists tracking online conversations, hoping to make sense of how they affect real-world events. For example, after the London mobs dispersed, The Guardian newspaper launched a collaboration with researchers from the London School of Economics and elsewhere to study whether social media had contributed to the magnitude of the riots, interviewing arrested looters, reviewing court records, and harvesting 2.6 million tweets. The stakes were high. Some had advocated giving the government the power to temporarily shut down social media sites.

    Extracting meaning from social media is no easy task. In the case of Twitter, the enforced brevity gives rise to an abbreviated language that can baffle computer analysis. Another problem, Tonkin says, is that computers “have no sense of humor.” Take, for example, “I am going to destroy that city,” which sounds like terrorism to American ears but in Britain signals the intention to get drunk on holiday. “Our tools aren't good enough yet,” she says, so researchers must still use human eyes to reality-check social-media data. And because Twitter users generate more than 200 million tweets per day, the crucial first step is to find what is relevant. Luckily, Twitter users organize their conversations with easily searchable hashtags, such as #londonriots.
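That first relevance-filtering step can be sketched in a few lines. The sample tweets and the exact tag list below are invented for illustration; the actual studies harvested hundreds of thousands of tweets through the Twitter API and combined automated filtering with checks by human readers:

```python
# Minimal sketch of the relevance-filtering step: keep only tweets whose
# hashtags match riot-related tags. Sample tweets and the tag list are
# invented for illustration.
import re

RIOT_TAGS = {"#londonriots", "#ukriots", "#riotcleanup"}

def hashtags(tweet):
    """Extract lowercased hashtags from a tweet's text."""
    return {tag.lower() for tag in re.findall(r"#\w+", tweet)}

def is_relevant(tweet):
    """A tweet is relevant if it carries at least one riot-related tag."""
    return bool(hashtags(tweet) & RIOT_TAGS)

sample = [
    "Broom army assembling at Clapham Junction 9am #riotcleanup",
    "Great goal last night! #football",
    "Stay safe everyone, avoid the high street #LondonRiots",
]
relevant = [t for t in sample if is_relevant(t)]
print(len(relevant))  # prints 2: two of the three sample tweets match
```

Hashtag matching only narrows the haystack; as Tonkin notes, judging tone, irony, and intent still requires human eyes.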


    Tonkin's first surprise was how visual those conversations were. About half of the tweets contained Web links, and the most popular led to photographs snapped by people in the street. One of the most iconic showed a sign on a London restaurant: “Due to the imminent collapse of society, we regret to announce we are closing at 6 p.m. tonight.”

    To explore whether rioters used Twitter to organize attacks, Tonkin's team tagged tweets by the sentiments they expressed, checking samples by eye to make sure the algorithms didn't stray far from reality. On the basis of that analysis, Tonkin declares the site innocent. “Clearly, Twitter did amplify signals,” she says, acting as a megaphone both for those calling for violence and those calling for calm. But in her sample, inflammatory comments “were generally condemned” by those retweeting them. On the contrary, she says, the most popular use of Twitter for organizing was for cleanup crews.

    The Guardian team also absolved Twitter, but they fingered less transparent online social media. At least two people attempted to use Facebook to set meeting points for mobs. And based on their interviews, the researchers concluded that many rioters were indeed using BlackBerry Messenger to coordinate attacks. The private network links BlackBerry phone users but creates no public trace on the Internet.

    Of course, what was once private is increasingly shared online for all to see. Completely closed networks like BlackBerry Messenger may give way to Facebook, which is only superficially private. Governments and companies are already harvesting data from those online conversations. Could crowd phenomena such as riots one day be predicted, like weather? And by selectively shutting down social networks, could riots be controlled or even prevented? Possibly, Zweig says, but “I'm actually not sure we would want that as a society.”

  13. Civilization's Double-Edged Sword

    1. Andrew Lawler

    Recent archaeological finds from the Near East to Southeast Asia of ancient massacres raise questions about how violence changed as societies became more complex.

    Feast and famine.

    The rotting corpses of these Tell Brak victims were thrown into a pit with the remains of a massive barbecue.


    The corpses were left to rot by the hundreds in the hot Syrian sun. The victors whittled three dozen or more of the human arm and leg bones into pointed sticks, perhaps as tools to further desecrate the skulls. After at least a few weeks, the decomposing remains were dragged a short distance and pushed into a trash pit. Then the victors butchered cows, sheep, and goats for an extravagant barbecue. When they were done, the celebrants hurled the picked-over animal bones and the ceramic plates, made especially for the occasion, onto the decaying heap. Then they returned to their homes in the nearby prosperous city.

    As fighting engulfs the cities of modern Syria, the scene above is stark proof that mass warfare and civilization share a 6000-year history. The killing field at Tell Brak, about 500 kilometers northeast of Damascus near the Iraqi border, documents what is perhaps the world's oldest known large massacre or organized battle. When the violence occurred in about 3800 B.C.E., this settlement was evolving into one of the world's first fledgling cities.

    What took place here and at other sites where complex societies were starting to coalesce is of particular interest to researchers studying human violence. Some scientists argue that civilization replaced tribal anarchy with a more organized way of life that reduced rates of violence (see p. 829). But recent finds around the world suggest an upsurge, not a decline, in violence during the key period when societies transitioned from the simpler organization of tribes and chiefdoms into complex urban life.

    Even in the Near East, which has been intensively excavated for more than a century, scientists still grapple with fundamental questions about violence. There has been surprisingly little physical evidence for warfare, massacres, or even widespread murder here between the rise of agriculture around 10,000 B.C.E., when people were living in less complex groups, and the emergence of the Akkadian Empire around 2300 B.C.E. Recent finds in northern Syria, however, suggest that violence flared as urban life first began to take hold between 4000 B.C.E. and 3200 B.C.E.

    The large settlement of Hamoukar was destroyed around 3500 B.C.E.; a team there found hundreds of what they maintain are sling bullets (Science, 31 August 2007, p. 1164). Even more startling, in 2007 and 2008, University of Cambridge archaeologists found three mass graves dating from about 3800 B.C.E. to 3600 B.C.E. at Tell Brak. The oldest and largest grave was at least 20 meters long and 4 meters wide, and included a jumbled pile of at least several hundred people—by far the earliest undisputed example of an event of mass violence. In a 2011 paper in the Journal of Field Archaeology, dig director Augusta McMahon and colleagues note that the majority of the dead were between the ages of 20 and 35. Although poor bone preservation makes it difficult to establish gender and cause of death, the state of the disarticulated bones suggests that the individuals all died at the same time and were left in the open for weeks or months. Many of the victims had previous head injuries that had healed.

    The bodies' exposure, delay in burial, the careless collection of the rotting corpses—which ignored smaller bones from the hands and feet—and the casual disposal all “show an extraordinary disregard” for the victims, McMahon says, implying “that the dead were enemies.” That is underscored by the fact that more than 40 human bones were whittled to make tools, which McMahon suggests were used “to deflesh and empty trophy skulls,” given that some skulls have deep scratches. The apparent victors then celebrated an astonishing feast involving as many as 75 cattle and 300 sheep and goats. The remains of this repast were tossed on top of the corpses and then covered with dirt.

    Making their point.

    Tell Brak dig director Augusta McMahon examines a human bone shaped into a pointed tool.


    Whether the victims were locals or outsiders remains unclear. Study of tooth enamel from the bodies shows that at least some suffered from malnutrition. Tell Brak sits in a region prone to long-term drought and may have stored a surplus that drew hungry mobs; it also had workshops that harbored a wealth of beautiful objects that may have attracted envious invaders or spurred civil war.

    McMahon suspects internal strife because Tell Brak was so populous in this period that it was not very vulnerable to outsiders. She notes that the callous treatment of corpses may have been “a useful control device” to deter internal discontent. Whatever the fight, it was clearly an organized killing field. “Here you see mass violence … motivated or controlled by central authorities,” says archaeologist Henry Wright of the University of Michigan, Ann Arbor.

    Peaceable kingdom?

    On the other side of Asia, in the jungles of Thailand and Cambodia, complex society arrived several millennia later. Small villages predominated here for centuries, until about 900 C.E., when the first complex society arose around Angkor Wat in Cambodia.

    Until recently, there has been a remarkable dearth of evidence for violence here during the region's Iron Age before the rise of Angkor. In that period, which began about 500 B.C.E., iron tools and weapons appeared, and tribes appear to have coalesced into more organized chiefdoms run by an elite.

    But recent work shows that Southeast Asia was no peaceable kingdom just before or during the rise of Angkor. At a site in Cambodia called Phum Snay, about 80 kilometers northwest of what became the capital of Angkor Wat, archaeologists from Australia and New Zealand have found an array of sophisticated military artifacts such as swords (some more than 1 meter in length), daggers, spearheads, and epaulettes, as well as signs of violent conflict dating between 100 C.E. and 300 C.E.

    Remains of the day.

    Murder victims at Phum Snay bear traumatic wounds; O'Reilly (inset, at left) and a colleague examine the remains.


    Researchers estimate that of the 30 skeletons from the town's cemetery, nearly a quarter had traumatic bone lesions, with males affected more often than females. Given the location and type of injury, these wounds are more likely due to interpersonal violence than to accidents, says biological anthropologist Kathryn Dommett of James Cook University in Townsville, Australia, lead author of a 2011 Antiquity paper on the subject. And because soft-tissue wounds leave no skeletal record, the level of violence was likely even higher.

    These data mesh with the results from digs across the border in Thailand. Combining satellite imagery with fieldwork, archaeologists have mapped dozens of large settlements surrounded by elaborate moats and ramparts from this era. “This is a period of intense political development and possibly increased competition,” says archaeologist Dougald O'Reilly of the Australian National University in Canberra, who led the Cambodian dig and has published on the Thai finds.

    That is borne out at the Thai site of Noen U-Loke. Archaeologist Charles Higham of the University of Otago in Dunedin, New Zealand, found evidence for intercommunity conflict in a sudden proliferation of iron projectile points in the Iron Age, including one lodged in the spine of a young adult male; an older female had her head cut and smashed. Other sites, such as Ban Wang Hai, a village just to the north, have yielded iron swords and other weapons not associated with hunting.

    O'Reilly says that the prevalence of “blunt force wounds” in this period points to either ritual warfare or a scramble to control resources such as iron. “There is a lot of evidence of increased warfare” just before the rise of Angkor, Higham says. O'Reilly believes that such conflict arose as populations increased along with competition for resources. As at Tell Brak, when other types of social complexity rose, so did the scale and complexity of warfare.

    Archaeologist Glenn Schwartz of Johns Hopkins University in Baltimore, Maryland, notes that “early complex societies were able to organize much more effective killing machines, given their administrative and technological capabilities and large populations.” So while laws, fortifications, and armies may have protected the bulk of the citizenry, when warfare did take place, it could unfold on a greater scale and with greater ferocity than among the preceding smaller groups. “The nature of warfare changes,” says anthropologist Patrick Nolan of the University of South Carolina, Columbia. “The frequency may decrease but the scale goes up.”

    Historian Azar Gat of Tel Aviv University in Israel warns that the spectacle of large battles may mask the more important truth that a given individual in a complex society would be less likely to die from violence. “Ramses II took 20,000 troops to fight the Hittites,” he says of the 13th century B.C.E. Egyptian king. “But the population of Egypt was 2 or 3 million. They were largely sheltered.”

    Even as they document cases of violence in early states, archaeologists are hesitant to generalize about long-term trends because archaeology can provide only glimpses of the past. For example, ancient texts describe bloody battles in the Near East in the 2nd millennium B.C.E., but archaeologists have found few sites to support the textual history. That's why single finds, such as the one at Tell Brak, can quickly rewrite old views—but should be considered cautiously, researchers say. “Serendipitous data discovery in archaeology frequently refutes statistical laws,” says Yale University archaeologist Harvey Weiss.

    Instead of trends in rates of violence, some researchers focus on how it assumed new forms, such as institutional slavery and human sacrifice, that are not seen in simpler societies. “Warfare and slavery go hand in hand” in the ancient world, Nolan says. Most slaves, he says, were captives of war, often the wives and children of slain soldiers. Brutal human sacrifice also appears as early states consolidate and display their power (see p. 834).

    Many complex societies quickly developed moral codes and written laws designed to protect the young, the poor, and the defenseless. But they also found a galaxy of reasons to punish nonviolent behavior with violence, as U.S. sociologist Steven Spitzer and French philosopher Michel Foucault have noted. Sexual deviance, religious heresy, and betrayal of the state all could be punished with tortures and extended imprisonment undreamed of by our ancestors in simpler societies. The rise of civilization was indeed a double-edged sword.

  14. The Ultimate Sacrifice

    1. Ann Gibbons*

    Seeking to impress both gods and humans, early state societies across the globe displayed their power by ritually killing human victims.

    Royal prerogative.

    An artist's impression of the death scene in a royal tomb at Ur from The Illustrated London News in 1928.


    The 63 skeletons were arranged in the sealed death pit like actors on an eerie stage set. Just outside the King's Grave, archaeologists found six soldiers lined up, still wearing helmets and “guarding” the royal tomb. Beside them were two ox-drawn carts with drivers, grooms, and oxen lying nearby. Rows of men and women lined the passage to the tomb, and courtesans with elaborate golden headdresses sat in a circle around a set of musical instruments. This was a “theatre of public cruelty,” enacted at the death of a Sumerian ruler about 4500 years ago in ancient Mesopotamia, according to an initial report by Leonard Woolley, the British archaeologist who excavated the royal tombs of Ur in Iraq in the 1920s and 1930s.

    Woolley concluded in 1934 that these courtesans and servants had drunk some “deadly or soporific drug” from cups and a large copper cauldron he found in the pit. Most scholars accepted his account that the victims had gone willingly to their deaths, to serve their ruler in the netherworld.

    So, when three researchers at the University of Pennsylvania (Penn) Museum of Archaeology and Anthropology in Philadelphia took their first look recently at computed tomography scans of two skulls from the death pit, they got a big surprise. “Holy cow!” said paleoanthropologist Janet Monge when she saw unmistakable radiating fractures from a blow to the side of a skull. This wasn't a case of mass suicide à la Jim Jones, but the ritual murder, or sacrifice, of 63 humans.

    The Penn team proposed in a report in Antiquity last year that the retainers “were felled with a sharp instrument, heated, embalmed with mercury, dressed and [only then] laid ceremonially in rows.” The ornaments and helmets had obscured the damage from the mortal blows for decades.

    This new look at the victims of Ur is one of a flurry of multidisciplinary studies that has recently documented a macabre trail of human sacrifice that leads to every corner of the world, from the death pits of Ur and China to burials atop the highest peaks of the Andes. Using rigorous forensic and bioarchaeological methods, researchers have been able to reconstruct victims' last days and hours, and sometimes their identities, testing controversial claims of human sacrifice. “This is an exciting time for this kind of research,” says biological anthropologist John Verano of Tulane University in New Orleans.

    Researchers are finding that although human sacrifice was not frequent in most cultures, it was pervasive, taking place at one time or another in just about every ancient civilization in which someone had the rank and power to decide who died, Verano says. Although human sacrifice was seen as barbaric by classical times, it persisted in Rome, the Americas, and elsewhere until the rise of Judeo-Christian and Islamic religions that condemned it. Across cultures, most cases shared twin motivations: to please the gods, and to vividly assert and display rulers' power. For early states, whose rulers were consolidating power, ritual sacrifice seems to have been one way to discourage outside attacks and internal revolt by sowing fear. The cross-cultural data are beginning to give researchers an idea of “key patterns in the origins, motivation, and methods of [sacrifice],” says bioarchaeologist Haagen Klaus of Utah Valley University in Orem.

    Myth or reality?

    It doesn't take a scholar to guess from the friezes of Roman temples, images on Maya pots, or scenes in ancient Greek plays that our ancestors might have sacrificed one another. Historical accounts—from Herodotus in Greece and Pliny the Elder in Rome to Spanish priests in the Americas—recount sacrifices made by the Scythians, Etruscans, Romans, Incas, Aztecs, and Norse. Engraved labels on ancient Egyptian jars suggest that some early rulers took servants and concubines with them to the next world. Art on ceramic urns shows a Maya god “sitting down to a plate of human hearts, just like a Maya king would eat a plate of tamales,” says bioarchaeologist Andrew Scherer of Brown University.

    Although depictions of ritualistic decapitation and dismemberment are found in the art and literature of many societies, convincing physical evidence has been rare until recently. “People didn't really look at marks of perimortem violence, so they didn't see the evidence,” says bioarchaeologist Vera Tiesler Blos of the Autonomous University of Yucatán in Mexico.


    This led some researchers to challenge the claims for human sacrifice in general and in Mesoamerica in particular. “I don't think that what we say is human sacrifice is anything other than [deaths in] war,” says archaeologist Elizabeth Graham of University College London, who studies Maya sites in Belize. She notes that victims are often captives taken in war. “All societies have socially sanctioned killing,” she says, citing the Holocaust as a particularly grievous recent example. “The poor Aztecs have been made out to be the most brutal people in the world, but if it's actually warfare, they killed few people.”

    Researchers agree that iconography and texts alone can't confirm sacrifice. For example, ethnographic accounts claim that the Aztecs slaughtered 80,000 war captives when dedicating the Great Pyramid of Tenochtitlan in 1487, but this is widely considered an exaggeration. However, in the past 15 years, researchers at Aztec sites have excavated sacrificial knives and stones, some with traces of human blood, as well as bones with cut marks and signs of heart extraction. This has “led us to conclude without a doubt that human sacrifice was a basic practice of Aztec religion,” says Leonardo López Luján of the Templo Mayor Museum at the National Institute of Anthropology and History in Mexico City. At Tenochtitlan, his team found 47 decapitated bodies and 42 children with slit throats.

    Death of a child.

    The Inca cut open this Muchik child's chest and removed the heart about 500 years ago.


    So many new cases of sacrifice have been documented in the past decade that researchers classify them informally. There are retainer burials where slaves die with their owners; offerings of prized children; dedicatory burials that are a sort of bloody feng shui to bless buildings, such as Tenochtitlan during construction; and ritual killings of captives from war.

    Feast for the gods.

    The Maya offered bowls with the heads, teeth, and bodies of children and adults at the El Diablo pyramid in Guatemala.


    The difference between these deaths and other state-sanctioned killings is that sacrifice is ritualistic. Researchers add that they aren't targeting any particular society; indeed, a major finding is that human sacrifice was found in most emerging city-states around the world, particularly under a new ruler or in times of crisis. At the same time, it was relatively rare within populations. “Not everyone gets a sacrifice at their funeral,” Scherer says. Klaus agrees: “There's not a lot of trauma in the populations at large. But a special subset of people did die extremely brutal and violent deaths at a variety of sites.”

    Loyal subjects

    Retainer sacrifice, as at Ur, was apparently performed so that rulers could live in the afterlife much as they did in life, and to demonstrate their importance to the living. “It's not a sacrifice in the sense of slaughtering a cow or offering meat” to a god, says Penn archaeologist Richard Zettler. At Ur, the court attendants were set up as though they were at a banquet with food, drink, and music. They were adorned in golden wreaths studded with lapis lazuli and carnelian.

    But did those who died really play the roles of guards, grooms, and courtesans in life? Strontium isotopes in bones and teeth show that two retainers at Ur were born locally and were not foreign captives, suggesting that they were indeed servants, says Penn archaeologist Aubrey Baadsgaard.

    Such extravagant retainer sacrifices were rare. At Ur, the practice appears in only 16 out of about 2000 graves unearthed in the Royal Cemetery. But it also occurs in Egypt, at the tomb of King Aha in Abydos, in 2900 B.C.E.; and in China in the 2nd millennium B.C.E. when kingship had just been established, says archaeologist Glenn Schwartz of Johns Hopkins University in Baltimore, Maryland. “When you establish a new kingdom, a new kind of political organization with a ruler at the top, very often there is this strategy of making a big show of the power of this new social order by having this kind of retainer sacrifice,” Schwartz says.

    At Ur, the number of sacrificial victims and the wealth of the treasures decline from about 2600 B.C.E. to 2450 B.C.E. The practice also declines and then vanishes in Egypt, perhaps because it was too costly to bury such wealth, both in objects and human life, or because established kings didn't need such a conspicuous display of power.

    The Maya also practiced a form of retainer sacrifice in which some victims were children. In 2010, Brown University archaeologist Stephen Houston and his colleagues found six blood-red cache vessels beside a king's body in an airtight chamber of the El Diablo pyramid in the jungle near El Zotz, Guatemala. Interred in about 350 C.E., the caches contained the heads, teeth, and bodies of six children, aged 6 months to 5 years. The smallest were stuffed in the bowls whole, but the older children had been dismembered. Several had been ritually burned around the face and chest with low heat.

    This matches previously known Maya iconography, showing children burning in large bowls with their hearts cut out, Scherer says. Such sacrifices “don't seem to have anything to do with warfare,” Scherer says. “The Maya are replicating myths, with scenes of child sacrifice to the maize god.”

    Mountaintop maiden.

    This 15-year-old Inca girl was sacrificed atop Volcán Llullaillaco in Argentina 500 years ago.


    Sacrificial lambs

    Children were victims in other cultures, too, perhaps because they are often seen as the most precious offering. The Inca, for example, built platforms high in the southern Andes, where they held mountaintop ceremonies called capacocha, in which they sacrificed beautiful, unblemished children.

    For example, a 15-year-old girl called the Llullaillaco Maiden was discovered in 1999 with a 7-year-old boy and a 6-year-old girl atop the 6739-meter-high Volcán Llullaillaco in northwest Argentina. The children were buried about 500 years ago, with gold and silver figurines. Two had headdresses of white feathers; one, a silver bracelet. Their youth and rich gifts suggest they were not captives of war, says archaeologist Johan Reinhard of the National Geographic Society in Washington, D.C.

    The children were apparently treated well, consistent with ethnographic records suggesting that it was an honor to be chosen for this sacrifice. Stable isotopes from the Maiden's hair showed that her diet changed dramatically about a year before death, from a peasant's diet to one suddenly rich in meat and maize, an elite food; her diet shifted to more grains a few months before her death, as she trekked to the peak. “Children were specially selected and treated royally perhaps a year before they were taken up to the mountaintop,” Verano says.

    Child “sacrifice doesn't mean giving up those you don't like,” Klaus says. “It's giving up those that matter the most.” And the Inca may not have thought of their children as dying. “In this very sacred mountain environment, they'd be seen as living with the gods,” Reinhard says. “They would, in essence, become deified.”

    And yet even Inca priests may have had an eye to impressing other humans as well as the gods, says bioarchaeologist Tiffiny Tung of Vanderbilt University in Nashville. The feting of the children en route to the peaks and the hubris of staging sacrifices at such lofty heights would have inspired awe and fear, helping the Inca assert power over their vast empire, Reinhard says.

    Trophy head.

    The Wari of Peru made a trophy of this captive foreigner's head in a highly ritualized process.


    Captive audience

    There are many instances of reverential child sacrifice, but researchers agree that killing captives after battle may have been the most common kind of human sacrifice. Performed by cultures as diverse as the Aztec, the Wari, and the Shang Dynasty in China, this practice involves more than merely disposing of captives, those who study it insist. It is designed to shock and awe both enemies and subjects.

    For example, in the 1980s, Chinese researchers uncovered 14 skulls in a row, including one placed inside a bronze food steamer, in the royal cemetery of Anyang, the capital of the ancient Shang dynasty in east-central China. The researchers assumed the skull fell into the pot by accident. Then in 1999, another skull turned up in a steamer in a tomb in a later Shang capital, according to archaeologist Tang Jigen of the Chinese Academy of Social Sciences. This “leads us to the inescapable conclusion that the Shang people did indeed have the cruel custom of steaming human heads,” he said in a recent publication.

    Anyang fits the profile of cultures that sacrifice captives: At about 1200 B.C.E., it was the center of the country's first expansive power. Archaeologists found up to 15,000 sacrificial victims during digs in the 1930s and 1950s, and are now examining them in detail. Most are men of military age who were decapitated, Tang says.

    The men's arms and legs were frequently cut off in similar ways, suggesting they were killed ritually rather than in battle, and the human remains are mixed with animal bones. Few pits contain goods such as pottery that are included in typical burials. Shang oracle bones provide hints of sacrificial procedures: One inscription made on a defleshed skull mentions the decapitation of an enemy leader.

    The deaths may mark the ritual killing of war captives in order to provide food or slaves to ancestors, says archaeologist Roderick Campbell of New York University, noting that the pits often contain remains of cattle, dogs, grain, wine, and other material commonly used in sacrifices. Later dynasties did not continue the practice, and later Chinese annals are silent about it.

    Halfway around the world, the iconography of the Moche of northern Peru also suggests brutal sacrifice of war captives, done in ways that highlight the victor's power. Images show captives being paraded naked with bloody noses before a warrior priest and having their throats slit. But until the 1990s, some researchers thought that such scenes depicted Moche mythology or staged drama, not reality.

    Baptism by fire.

    The Maya offered babies to their gods, as shown in this mythological scene.


    The overwhelming bioarchaeological evidence of hundreds of sacrificial victims, gathered since the 1990s, contradicts that view, Verano says. For example, the remains of more than 100 young men in a Moche plaza at the pyramid of Huaca de la Luna were either left exposed on the surface to be buried by windblown sand, or were incorporated in the fill of plazas during their construction around 500 C.E., Verano says. Analysis of the remains suggests these victims were captives brought back from battle, as they had wounds that had partially healed. Patterns of cut marks on the neck vertebrae and other bones confirm they were decapitated and that their bodies were defleshed. This was, Verano says, a “prominent display of military victory.”

    Deep cuts.

    A captive Muchik sacrificed in Peru suffered a series of deep cut marks on his collar bone to extract his heart.


    The study of sacrifice is also illuminating the politics and social structure of ancient societies. For example, researchers knew that after the Moche collapsed, their descendants, the Muchik, were ruled by another culture, the Sicán. Both cultures practiced sacrifice. But the details suggest that the Sicán governed loosely, Klaus says, because the Muchik still killed victims, often children, the traditional Moche way: The children's throats were slit, their chests were cut open to remove hearts, and their bodies buried with long-standing Moche funerary rituals. “It's a Moche template,” not a Sicán one, Klaus says.

    In other cases, researchers are using the existence of human sacrifice to show that certain cultures were more organized and sophisticated than had been realized. For example, a few scholars have suggested that the Wari of central Peru were not a state-level society. But Tung says their practices of human sacrifice, dating from as early as 600 C.E. to 1000 C.E. at Conchopata, suggest state-level control and organization.

    Tung and Kelly Knudson of Arizona State University, Tempe, have analyzed stable isotopes in 72 Wari trophy heads, many of children, and buried bodies. Of 29 properly buried bodies, all belonged to local people. But almost all of the trophy heads came from foreigners. This suggests that they were captives, according to a report last year in the Journal of Anthropological Archaeology.

    Captives were brought alive to Conchopata, beheaded, and then processed into trophy heads in a “very systematic, very standardized way,” Tung says. “They clearly had a standardized tool kit for drilling holes on the top of the head for the cord, so the heads would be upright and facing forward when displayed.” This matches drawings on large ceramic urns, which show Wari warriors seizing prisoners and carrying trophy heads. “This is important, because it suggests you have Wari state structures used to promote this”—to coordinate the warriors, the priests who made the trophy heads, and the artists who depicted them on urns, Tung says.

    The practice suggests a state-level society asserting its absolute authority against outsiders, Tung says. “Sacrifice is very orchestrated—it's not just death on the battlefields. It's a performance to demonstrate to your internal community and outsiders your absolute power.”

    As more cases of sacrifice emerge, some defy classification. This suggests that researchers have just begun to exhume the myriad ways that humans killed each other in the name of the gods and the state. “Our ability to see sacrifice in the past was somewhat limited. Now we're able to expand that view,” Monge says. “I'd say we're just coming to realize in some measure the enormity of the violence of humans against humans.”

    • * With reporting by Andrew Lawler.

  15. In Battle

    Fighting Rituals

    1. Elizabeth Pennisi

    Humans aren't the only species to engage in ritual combat. Honeypot ants of the U.S. Southwest stage elaborate, ritualistic face-offs against nearby nests that can involve hundreds of ants and last for days.


    Bluster saves lives. Certain agricultural tribes in New Guinea are known for their “nothing fights,” in which armed warriors gather by day to yell and gesture but rarely come close enough for combat. Such shows of force can go on for days and weeks, with men heading home at night and taking mutually agreed-upon breaks. It's one of many ways people manage conflict so that few get hurt and détente is maintained.

    But it's not just humans who engage in ritual combat. Groups of birds known as green wood hoopoes engage in calling contests (see p. 828), and some species of ants settle their differences with mock tournaments. For example, honeypot ants of the U.S. Southwest stage elaborate, ritualistic face-offs against nearby nests, involving hundreds of ants and sometimes lasting for days. “Displays are such an important way of conveying information because it's a way of avoiding the risk of physical injury,” explains Andrew Radford, a behavioral ecologist at the University of Bristol in the United Kingdom. Group displays convey the group's ability to compete. That birds and ants mount these shows of force suggests a common evolutionary pressure to minimize actual losses and demonstrates that such behaviors can emerge without organized leaders or higher thought processes.

    Ants, which make war on each other and will fight to the death to defend nests and territories, are “the most aggressive animals,” says behavioral biologist Bert Hölldobler of Arizona State University, Tempe, who has studied these insects for decades. Yet the honeypot ants (Myrmecocystus mimicus) stage fights that don't kill. These ants, which are each less than a centimeter long, eat termites, dead insects, and honeydew, and earned their name because some colony members store food in their swollen abdomens, becoming living “honey pots.” The ants' foraging territories often overlap with those of nearby nests.

    In a ritual contest, two honeypot ants meet head-on and stiffen their legs as though on stilts. They may climb up onto a pebble, as though trying to appear larger. They hold their abdomens and their heads high and probe each other with their antennae. Ants from the same colony back off within a few seconds, whereas those from different colonies turn sideways to one another, lift the abdomen even higher, and point it at the opponent. They kick and drum their antennae on each other for up to 30 seconds, then disengage, uninjured, and move on to the next ant.

    Typically, the smaller ant yields to the larger ant; analyses of films of tournaments suggest that the ants try to fool one another into “thinking” they are larger, says Hölldobler. A single foreign ant can trigger the displays, which then grow as scouts recruit up to several hundred workers to the tournament.

    During tournaments, the ants appear to track encounters to get a sense of which side has more numbers, says Hölldobler. The recruiting scouts do this, never engaging an opponent for more than a few seconds and wandering throughout the tournament area.


    In many ways, the displaying ants are the equivalent of ritualized antler butting in male deer. These bouts reduce the chances that rivals will be hurt or killed in disputes. That makes sense from an evolutionary perspective, because in a real fight, even the stronger opponent risks injury. For example, in another species of ant, Australia's meat ant (Iridomyrmex purpureus), displaying ants line the boundaries of each colony's territory and help maintain a stable border at a minimal cost to the colony, says evolutionary biologist Ellen van Wilgenburg of the University of Melbourne in Australia. “If most confrontations between non-nestmates involved potentially fatal fighting, then the conflict zones would act as a significant drain on the workforce,” she notes. In 2005, she and her colleagues showed that if these displays escalated into real fights, resident ants tended to be the aggressor, most likely because they had more to lose.

    In the honeypot ant contests, if one group outnumbers the other, the tournament shifts ever closer to the smaller group's nest, interfering ever more with foraging by its workers. If the contest is uneven enough, the smaller group retreats and closes off the nest entrance, hiding out until the rivals have gone back to their own nests. Sometimes the retreat isn't fast enough, and the larger group raids the nest of the smaller one, carrying off the young, stealing the honey pots, and killing or running off the queens.

    The same breakdown can happen in human “nothing fights.” As University of Oxford archaeologist Barry Cunliffe wrote in 2006 in the book Conflict, if one side fails to show up, a rout can ensue. And the losers wind up abandoning their villages and sometimes being killed.

  16. Gender and Violence

    1. Mara Hvistendahl

    Researchers are probing links between the status of women in a society and its propensity toward war.

    When Erik Melander first encountered a series of papers linking gender inequality and war, the premise struck him as fishy. Melander had trained as a conflict scholar in the 1990s, when the forces underlying war were relatively well established: a lack of democracy, a low level of economic development, and the presence of nationalism. The status of women wasn't even on the list.

    So he was not convinced when, in 2000, he read the first in a group of studies by political scientist Mary Caprioli of the University of Minnesota, Duluth, that challenged some closely held views about violence and war by connecting the low position of women to conflicts from international aggression to civil war. The idea that the status of women helped predict a state's volatility sounded “like wishful thinking,” recalls Melander, deputy director of the Uppsala Conflict Data Program at Uppsala University in Sweden. Part of the problem was what he assumed was a simplistic approach to sex differences underlying the connection. The notion that women are biologically so hard-wired for peace that simply giving them more say in international affairs would yield tranquility seemed suspect. “I expected that when controlling for other factors,” he says, “any relationship would go away, or the effects of gender equality would be very small.”

    Then he tested the hypothesis himself. Melander checked the claim that gender inequality correlated with escalated levels of conflict within states. Measuring the status of women by looking at the sex of a country's highest leader, the proportion of women in the legislature, and the ratio of women to men who receive higher education, he controlled for factors like democracy, economic development, and the time since a country's last civil war. To his surprise, the results, published in International Studies Quarterly in 2005, confirmed the finding that had aroused his suspicion. In a second study published that same year, he broadened the picture: States where women were oppressed also had higher rates of political imprisonments, killings, and disappearances.
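    The logic of "controlling for" confounders, which Melander used so that any gender-equality effect could not simply be a democracy or development effect in disguise, can be illustrated with a toy stratified comparison. The sketch below is purely illustrative: the data, variable names, and thresholds are invented, and the actual studies fit regression models on real country-year data rather than stratifying by hand.

    ```python
    # Toy illustration of "controlling for" a confounder by stratification.
    # All numbers are invented; studies like Melander's instead fit regression
    # models on country-year panel data with many controls.

    from collections import defaultdict

    # Each record: (gender_equality, democracy, civil_conflict) as 0/1 flags.
    records = [
        (1, 1, 0), (1, 1, 0), (1, 1, 1), (1, 0, 0), (1, 0, 1),
        (0, 1, 1), (0, 1, 0), (0, 1, 1), (0, 0, 1), (0, 0, 1),
    ]

    def conflict_rate(rows):
        """Fraction of records that experienced conflict."""
        return sum(c for _, _, c in rows) / len(rows)

    # Stratify by democracy so its effect cannot masquerade
    # as a gender-equality effect.
    strata = defaultdict(list)
    for eq, dem, conflict in records:
        strata[dem].append((eq, dem, conflict))

    for dem, rows in sorted(strata.items()):
        high_eq = [r for r in rows if r[0] == 1]
        low_eq = [r for r in rows if r[0] == 0]
        print(f"democracy={dem}: conflict rate "
              f"{conflict_rate(high_eq):.2f} (high equality) vs "
              f"{conflict_rate(low_eq):.2f} (low equality)")
    ```

    In this invented data set, the low-equality group shows a higher conflict rate within both democracy strata, which is the pattern a regression with controls would report as an independent gender-equality effect.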

    Melander's research is among a nascent body of work in international relations showing gender inequality to be an important security barometer. By focusing on gender inequality rather than biological sex differences, these researchers say they have identified a previously overlooked trigger of conflict. Causality is far from proven, however, and some critics say that gender inequality could be a proxy for other underlying causes. Caprioli, whose work first piqued Melander's interest, is in some ways the ringleader of the new group. From 2000 to 2006, she published a series of widely cited statistical analyses linking the low status of women to a host of negative phenomena. After controlling for other factors, she found that states where women are treated poorly are more likely to become embroiled in disputes with other states, more likely to turn to violence in those disputes, and more likely to erupt into civil war.

    Dovish females?

    Many scholars contend that women are not inherently more peaceful. Margaret Thatcher led the United Kingdom into war.


    Still, she recalls a reviewer writing on one of her first published papers: “I don't recognize this as research.” She says: “It was very hard in the beginning because I was pioneering a new field of study.”

    Born warriors?

    Throughout human history, males have been the more violent sex. Skeletons unearthed from early human societies show more head injuries among males, and men are believed to have been the aggressors as well as the victims. Gender clearly matters in conflict. But sorting out just how and why it matters—and how significant a role biology and culture each play—has proven a thorny task.

    One of the first works to tackle that challenge was the 2001 book War and Gender, by political scientist and American University professor emeritus Joshua S. Goldstein. “That was a seminal piece of work,” says Ismene Gizelis, a political scientist at the University of Essex in the United Kingdom. “It brought the issue of gender into international relations and conflict studies.”

    The book grew out of a puzzle. If both gender roles and war practices vary widely from one culture to the next, Goldstein wondered, why was there not more variation when it came to gender roles in war? When he dug into research from biology, ethology, and anthropology, the obvious explanation—that biology turns women into doves and men into hawks—didn't completely explain the conundrum. To be sure, Goldstein determined that there are some core biological differences that affect war behavior. Childbirth and motherhood, for example, keep women away from the battlefield. But he contends that inborn traits are not as influential as commonly portrayed—and that biological factors interact with cultural ones.

    Testosterone levels, for example, can spike following shifts in a man's social status, such as getting married or winning a game. “Cultures are responsible for the exaggerated gender roles we see in societies,” Goldstein says. Melander agrees that although “there are evolutionary roots to the male warrior role,” a strictly biological view fails to “take into account the differences between, say, Sweden and Pakistan.”

    Risk factor.

    A high proportion of teen mothers can suggest low status for women, which correlates with violence and war.


    If a propensity toward fighting were inborn, men might report disproportionately greater enthusiasm for war. But even as Goldstein was researching War and Gender, studies were showing that men and women are often remarkably similar in their views on war. In the 1990s, for example, political scientist Mark Tessler and demographer Ina Warriner, then at the University of Wisconsin, Milwaukee, looked at attitudes toward conflict in the Middle East. They relied on surveys from Israel, Egypt, Palestine, and Kuwait in which participants had been asked questions such as, “Do you believe that the Arab-Israeli conflict can be solved by diplomacy or is a military solution required?” They found that men and women barely differed in their answers.

    But when Tessler and Warriner shifted their focus away from an individual's gender toward his or her perspective on gender equality, the picture changed. Along with questions about international aggression, the Middle Eastern survey participants answered questions such as, “Do you think it is more important for a boy to go to school than a girl?” Comparing the responses, the scholars found a strong association between sexism and bellicosity. “You can predict how a person stands on war and peace and on the Arab-Israeli issue based on what they think about gender equality,” Tessler says.

    Conflict scholars who focus on gender inequality believe values governing how states behave abroad reflect values within a society. “If you respect women, you also respect the rights of others,” Gizelis says. “As a result, you also deal with conflict in a different way.” Political scientist and University of Maryland, College Park, professor emeritus Ted Robert Gurr has found the same to be true of ethnic discrimination: Societies in which minorities suffer widespread discrimination are also more volatile.

    When war does break out, moreover, equal societies tend to be better at restoring peace. In 2009, Gizelis examined 124 civil wars from 1945 to 2000. Evaluating the status of women using indicators such as female-to-male life expectancy ratio and secondary school enrollment ratio, she found that U.N. peacekeeping operations were far more likely to succeed in states where men and women were relatively equal before the war.

    Essential ABCs.

    War-torn societies where girls leave school prematurely are less likely to restore peace.


    Some now hold up this body of work as evidence that raising the status of women is not just a good in itself: It can also help ensure global security. That idea is percolating through policy circles. In a 2005 speech on the empowerment of women and girls, U.N. Secretary-General Kofi Annan said, “I would venture that no policy is more important in preventing conflict, or in achieving reconciliation after a conflict has ended.”

    Caprioli is more reserved: “This link is but a piece of the puzzle,” she says. But, she adds, if promoting gender equality “decreases war by any percent, I would argue it is a change well worth making.”

    Others warn that correlation does not equal causation. Goldstein believes there is a link between gender inequality and war but says, “The big question is the direction of causality. Does gender inequality make societies more war-prone, or does being involved in lots of warfare exaggerate gender inequality?” Women, indeed, are among the most affected victims of war.

    Yale University political scientist Nicholas Sambanis cautions that the relationship might be explained away by a third factor. He ventures that the level of gender inequality might be “a proxy for other, more fundamental things, like cultural differences, rule of law, [and] institutional development.”

    No one has yet proven the existence of such a third factor. And Melander counters that causality is also difficult to show with other, more established explanations of conflict, such as a low level of economic development or an undemocratic regime.

    Better data may eventually help scholars unravel the connection between gender inequality and war. The World Economic Forum's Global Gender Gap Report and the U.N.'s Women's Indicators and Statistics Database both compile statistics on the status of women, such as the proportion of female professional and technical workers in a country and the percentage of women who have given birth by age 20. But because many of those statistics were introduced only in the last decade, scholars comparing conflict across time are limited to indicators for which good longitudinal data exist—and there are problems with all of them. The presence of a female leader can be misleading, because many women rise to power through dynasties—think Indira Gandhi or Benazir Bhutto. And while a low fertility rate, an indicator used by Caprioli in several papers, often suggests a higher status for women, there are notable exceptions.

    In the meantime, conflict scholars who are focused on gender inequality say their biggest challenge is simply being taken seriously. Some investigations of war looking at dozens of variables disregard gender inequality entirely. “Gender-related approaches still are not well integrated into the mainstream of conflict studies,” Goldstein says. “It's more like two camps living under the same roof but barely talking.” Melander contends that if those ignoring the research were to instead test the hypothesis, as he did, they might change their minds.

  17. In Battle

    From War to Peace

    1. Elizabeth Pennisi

    The ability to reconcile differences after a spat was once considered a uniquely human trait, but researchers have since observed it in more than 30 primate species—and even among birds.


    Kiss and make up. Animals from dung beetles to chimpanzees fight each other, but the ability to reconcile differences after a spat was once considered a uniquely human trait. Then in the 1970s while observing chimpanzees at a zoo in the Netherlands, ethologist Frans de Waal witnessed a male attacking a female. Immediately, other males came to her rescue, and soon afterward, amid hooting by the troop, the male and female embraced. In 1979, he published a paper documenting that after a fight, a whole lot of kissing, embracing, and holding hands went on between chimp opponents. De Waal, now at Emory University in Atlanta, called this behavior reconciliation, and researchers have since observed it in more than 30 primate species.

    Like people, animals that live and work together do better if they get along. And that means smoothing over differences, particularly between individuals who have long-standing, valuable relationships. “In the last 30 years, we have learned a lot about conflict management in primates,” says Filippo Aureli, now at the University of Veracruz in Mexico. “It is important to verify if animals other than primates behave in a similar way.” Finding these behaviors in other parts of the animal kingdom sheds light on their evolution and cognitive basis.

    Researchers first turned their attention from chimps to other mammals. A 1998 study found that goats that have tussled over food later nuzzle and groom each other. Hostile dolphins rub each other gently with their flippers or tow one another by a flipper to reestablish alliances after a fight. In 2010, Alessandro Cozzi of the Phérosynthèse Research Institute in Semiochemistry and Applied Ethology in Saint Saturnin Les Apt, France, showed that horses, too, make up after aggressive encounters by grooming, sniffing, playing with, or following one another. Studies in canines and hyenas have also found evidence of postconflict management strategies.

    Nicola Clayton, an ornithologist at the University of Cambridge in the United Kingdom, and her colleagues wanted to look beyond mammals. Birds and mammals “have very different evolutionary histories,” notes Orlaith Fraser, a cognitive biologist at the University of Vienna. If they shared these traits, that would suggest convergent evolution and might point to the circumstances under which these strategies evolve.

    In 2007, Clayton's team studied postconflict behavior in a group of 10 captive rooks, highly social, intelligent birds related to crows and ravens. They watched the free-flying but captive birds in their large cage each afternoon and took note of when one displaced another or showed other signs of aggression. For the next 10 minutes, they recorded whether either the opponents or another bird made any peaceful gestures, say, touching beaks or sharing food. They were looking not only for reconciliation but also for “consolation,” actions by an uninvolved bird to soothe the victim.

    They saw no evidence of reconciliation, which they interpreted as a sign that the relationships between opponents weren't very important. But they did find that the mates of victimized birds would try to console the loser, which makes sense, as rooks pair for life early on and have a lot invested in each other.

    Meanwhile, Thomas Bugnyar, now at the University of Vienna, had been following up on a pilot study in another species of bird, ravens, which suggested that these birds do smooth ruffled feathers after fights. In 2004, he monitored postconflict activity in a group of hand-raised ravens. Unlike rooks, young ravens can spend several years single, during which time they may develop friendships with other birds that can help them fight for dominance and access to food. Among these birds, he and Fraser often saw victims sidling up to bystanders or, after a particularly long chase or contentious encounter, bystanders offering consolation to losers. Typically, the bystanders were “friends” of the victim—they had hung out together, groomed each other, and shared information and food—Fraser and Bugnyar reported 12 May 2010 in PLoS ONE. They saw hints of reconciliation, but nothing significant.

    However, when they took a close look at the data from the pilot study of seven ravens studied in 2002, Fraser and Bugnyar did find evidence that opponents made up after fighting—if they had been in cahoots beforehand. “The key seems to be that there are strong social relationships,” Bugnyar says. As he and Fraser reported 25 March 2011 in PLoS ONE, reconciliation in these cases was worth the risk that a second fight might break out with renewed contact. Indeed, reconciliation reduced subsequent aggression. “The results are very similar to the results you get in primates,” Bugnyar says.

    Preliminary studies in ravens in a natural setting suggest that what happens in captivity happens in the wild, as well. “These behaviors are more widespread than we thought,” Fraser says. Like primates, the birds “have similar needs and have the same solutions to the same problems.”

  18. Drone Wars

    1. Greg Miller

    Are remotely piloted aircraft changing the nature of war?

    New flight plan.

    The U.S. Air Force now trains more pilots for crewless aircraft like the MQ-1 Predator.


    The video is slightly grainy, but it's easy enough to make out a white car parked on a dirt road and a person nearby setting up what looks like a tube. The video, shot in Afghanistan, comes from a remotely piloted aircraft (RPA), or drone, that's been following two men. Good intelligence suggests they are planning a mortar attack on a coalition airbase. The man in the video drops something into the tube and ducks for cover. Smoke shoots from the tube as a round is fired. The video jumps ahead to the man and his accomplice driving away. Small crosshairs superimposed on the video never waver from the car as it follows a riverside road. Suddenly there's a bright flash and a cloud of black smoke, the impact of a laser-guided missile launched from the drone.

    What happens next is just as important, according to Colonel James Bitzes, a legal adviser to the U.S. Air Force who showed the video at a conference last year at the New America Foundation, a think tank in Washington, D.C. (Bitzes's presentation is available online; the drone strike video begins around 38:20). People gather around the smoldering car, and some of them begin tossing objects from the trunk into the river. Another car pulls up and someone dumps an object from the trunk of that car, too. An Army team later recovered the objects—weapons—from the river, Bitzes said. “With the benefit of the RPAs, … there's a possibility of gathering additional intelligence,” he said.

    Given the potential to track down bad guys and kill them without risking American lives, it's no wonder the U.S. Department of Defense plans to spend $30.8 billion on developing and acquiring RPAs between 2011 and 2015. Dozens of other countries have or are pursuing drones of their own. Thanks to improvements in artificial intelligence, these machines will become more capable of making decisions, including, perhaps, whether to kill humans. As the technology zooms forward, experts are scrambling to catch up with the psychological, ethical, legal, and policy implications for 21st century conflict.

    Killing at a distance

    Not everyone agrees that RPAs represent a fundamental shift in military technology. “You see people breathlessly saying this is a revolution in warfare, but I think that's overstating it,” says Werner Dahm, chief scientific officer for the Air Force between 2008 and 2010 and current director of the Security & Defense Systems Initiative at Arizona State University, Tempe. “There's been a continuing progression in technology ever since the stone-throwing days. I think it's more like the sling is to the hand-thrown stone.” Dahm notes that the U.S. Air Force and intelligence agencies have been developing drones since World War II.

    All in a day's work.

    Waging war from halfway across the globe, RPA operators may not face the same combat stress as deployed troops.


    But unlike most of their predecessors, today's drones are operated half a world away from the war zone. “It's making the decisions even more remote from the action,” says Peter Hancock, a human factors researcher at the University of Central Florida in Orlando. “We really don't know what all the ramifications of that are.”

    There's little doubt drones are changing the experience of combat for those who operate them. Operators work at U.S. bases in dark, cool rooms designed more for protecting electronics than for ensuring human comfort, and they communicate with troops on the ground and intelligence analysts across the United States via Internet chat rooms. “Most of the time, I get to fight the war and go home and see the wife and kids at night,” one operator told Peter Singer, who directs the 21st Century Defense Initiative at the Brookings Institution in Washington, D.C. The quote comes from Singer's 2009 book Wired for War.

    As Singer and others have pointed out, drones raise questions about the psychology of killing. On one hand, the physical distance between the person pulling the trigger (or pushing the button) and the target has never been greater. In his influential 1995 book, On Killing, former Army Ranger Lieutenant Colonel Dave Grossman argues that killing at a distance is easier, psychologically, than killing at close range. Drawing on firsthand accounts gleaned from historical records and his own interviews with soldiers, Grossman writes of a human resistance to killing that increases with proximity: “This process culminates at the close end of the spectrum, when the resistance to bayoneting or stabbing becomes tremendously intense, and killing with the bare hands … becomes almost unthinkable.”

    On the other hand, the drones' cameras and other sensors may close the psychological distance. Operators track targets for hours or even days. They watch targets do bad things, but they also see them go home and play with their kids, says Colonel Hernando Ortega, a flight surgeon at the Air Force Intelligence, Surveillance and Reconnaissance Agency at Lackland Air Force Base in Texas. “It can be very personal,” Ortega says.

    The psychological impact of killing in combat—from a distance or otherwise—is a neglected and somewhat taboo area of research, says Shira Maguen, a psychologist at the San Francisco Veterans Affairs Medical Center in California. Since 2009, Maguen and her colleagues have published a series of studies of veterans of the Vietnam, Gulf, and Iraq wars that found that soldiers who report killing in battle suffer more symptoms of post-traumatic stress disorder (PTSD) than do those who saw similar levels of combat but did not report killing. Alcohol abuse and issues with anger and violent behavior also appear to be more common in those who've killed. Maguen says her work with deployed troops suggests that those who kill in self-defense seem to cope better. “With the drone pilots, they're not experiencing any personal threat, and I don't think we understand how that's important.”

    One of the first efforts to examine the psychological health of drone operators was led by Colonel Kent McDonald and Wayne Chappelle, who head the neuropsychiatry and aerospace psychology departments, respectively, at the Air Force School of Aerospace Medicine at Wright-Patterson Air Force Base in Ohio. In surveys given to 874 RPA operators and 628 airmen in other logistics and support work at the same bases, they found higher levels of stress and job-related fatigue in the RPA operators, the vast majority of whom worked on unarmed RPAs used for surveillance. The main causes of stress were long hours, rotating shifts, and the nature of the job, which entails hours of monotony punctuated by brief flurries of intense activity.

    Only 4% of drone operators in McDonald and Chappelle's study screened positive for heightened risk of PTSD. That's far lower than the 12% to 17% estimated prevalence of PTSD in veterans of the Afghanistan and Iraq wars, and comparable to the prevalence of PTSD in the general population. Only a small subset of these RPA operators participated in combat missions, and McDonald and Chappelle plan to investigate whether combat experience puts operators at greater risk. Ortega says he's not surprised by these findings. He attributes the apparently low risk of PTSD in RPA operators to a combination of training that mentally toughens all service members and the fact that even RPA operators on combat missions don't experience the same fight-or-flight activation of the sympathetic nervous system as soldiers on the battlefield.

    Drone diversity.

    Smaller RPAs like the RQ-11 Raven (top) can be deployed in the field, while the MQ-9 Reaper (bottom) can cover long distances.


    Robot arms control

    Removing the combatant from the war zone carries ethical and legal implications, says Armin Krishnan, a political scientist specializing in international security at the University of Texas, El Paso. “Targeted killing [of specific individuals] is becoming easier with drones,” Krishnan says, because it reduces the risk of American casualties, and with it, the political risk of launching an attack. Krishnan sees the covert U.S. drone attacks as particularly problematic. (U.S. officials acknowledged conducting a drone campaign in Pakistan only recently, but it was long an open secret: The New America Foundation estimates that 296 drone strikes have killed between 1785 and 2771 people in Pakistan since 2004, the vast majority of them militants.) Little is known about how the Central Intelligence Agency determines who is a legitimate target, Krishnan says: “The question is, is killing always justified? There's no public accountability for that.” He and others worry that drones make it easier to go to war. In a January article in The New York Times, Brookings's Singer noted that Congress has never debated the drone campaign in Pakistan. “What troubles me … is how a new technology is short-circuiting the decision-making process for what used to be the most important choice a democracy could make,” he wrote.

    Another debate swirls around technical advances that are making weapons systems more autonomous. The touchiest issue is whether a machine should ever make the decision to kill. Ronald Arkin, a roboticist at the Georgia Institute of Technology in Atlanta, argues that in some cases, a robot might make better ethical decisions than a human soldier blinded by anger, frustration, or fear. As a proof of concept, he and colleagues developed a prototype “ethical governor” that could be built into an autonomous weapons system. This software would kick in once the system locks onto a target and is ready to fire, and it would abort any attack that violates international laws of war or U.S.-specific rules of engagement.

    Another program they developed, a “responsibility adviser,” would ensure that a human is ultimately responsible for every action executed by the system. Arkin stresses that he hopes autonomous weapons will never be used, but given the history of warfare, he thinks it's an inevitability that must be prepared for.
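    The governor-plus-adviser architecture described above amounts to a constraint-checking layer between target selection and firing, with an audit trail tying each decision to a named human. The sketch below is a hypothetical, much-simplified rendering of that idea; the rule set, class names, and logging scheme are invented for illustration and do not reflect Arkin's actual prototype.

    ```python
    # Hypothetical sketch of a constraint layer that vets a proposed strike
    # before firing, loosely modeled on the "ethical governor" concept.
    # Rules, fields, and names here are invented for illustration.

    from dataclasses import dataclass, field

    @dataclass
    class Target:
        is_combatant: bool          # discrimination: combatant vs. protected person
        near_protected_site: bool   # e.g., hospital or school in blast radius
        expected_collateral: int    # estimated civilian harm

    @dataclass
    class EthicalGovernor:
        max_collateral: int = 0     # rules-of-engagement limit on collateral harm
        log: list = field(default_factory=list)  # "responsibility adviser" audit trail

        def authorize(self, target: Target, operator: str) -> bool:
            """Return True only if no constraint is violated; log the decision
            so a named human remains accountable for it."""
            violations = []
            if not target.is_combatant:
                violations.append("target not positively identified as combatant")
            if target.near_protected_site:
                violations.append("protected site within blast radius")
            if target.expected_collateral > self.max_collateral:
                violations.append("expected collateral exceeds ROE limit")
            decision = not violations
            self.log.append((operator, decision, violations))
            return decision

    governor = EthicalGovernor()
    ok = governor.authorize(Target(True, False, 0), operator="pilot_1")
    blocked = governor.authorize(Target(True, True, 2), operator="pilot_1")
    ```

    The key design point is that the check runs after lock-on but before firing, and that every authorization, granted or refused, is attributed to a specific operator in the log.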

    Determining whether a given attack violates international law isn't nearly as hard as discriminating combatants from civilians and other protected individuals, says Noel Sharkey, an expert on artificial intelligence and robotics at the University of Sheffield in the United Kingdom. Unfortunately, Sharkey says, “we have nothing like this at the moment.” He and Arkin disagree about how long it will take to develop. “Ron says maybe in 20 years; I say more like 100 or so,” Sharkey says. In 2009, Sharkey and three academic colleagues founded the International Committee for Robot Arms Control, which hopes to promote international discussion, and ultimately a ban, on autonomous weapons. “If the technology is developed, I'm pretty certain it will be used, ready or not,” Sharkey says. “That's my real worry.”