News Focus
Profile: John Shea

Archaeologist Hammers Away At 'Modern' Behavior


Science  08 Feb 2013:
Vol. 339, Issue 6120, pp. 642-643
DOI: 10.1126/science.339.6120.642

Stone tools suggest that the earliest modern humans were as smart as we are—they just had different problems to solve, an archaeologist argues.

STONY BROOK, NEW YORK—One day in the late 1980s, an alarmed secretary at Harvard University called campus police. An apparently crazed young man had cornered a deer in the courtyard of the university's Peabody Museum and was hurling spears at it. Once on the scene, however, the police established that the deer was dead when it arrived on campus, and that archaeology graduate student John Shea was simply doing research: He was trying to understand how striking a large mammal damaged stone points.

That episode, says then-fellow Harvard grad student Daniel Lieberman, was a prime example of "John Shea being John Shea." It demonstrates Shea's hands-on, take-no-prisoners approach to prehistory, says Lieberman, who is now a paleoanthropologist at Harvard. "John is always a little on the edge. … He doesn't just sit back in the lab and ponder the Paleolithic, he tries to understand it on its own terms."

Making his point.

John Shea thinks modern humans were cognitively advanced long before they etched this piece of ochre (inset) 77,000 years ago.

CREDITS: SARAH PILLIARD; (INSET) CHRISTOPHER HENSHILWOOD

Shea, now at New York's Stony Brook University, has remained on archaeology's cutting edge despite his reputation for occasional outrageousness. His stone tool studies have helped archaeologists identify how stone points were used, and he has documented the sophisticated toolmaking skills of the oldest known Homo sapiens. More recently, Shea has been doing his best to shake up human origins research with a radical proposal: that the idea of "behavioral modernity"—a term long used by scientists to describe behaviors such as the use of symbolism, art, and elaborate tools—should be thrown on the scrap heap.

Shea argues that the first H. sapiens about 200,000 years ago had cognition fully equal to ours today. Instead of studying how our species gradually acquired "modernity," he urges analyzing our "behavioral variability," or the number of different ways we adapted to changing conditions. "The concept of modernity has been like a security blanket," Shea says. "But it's a 19th century model, the idea that evolution is directional and ends with us modern humans. It's an embarrassment, and we don't need it anymore."

Although many researchers agree that the concept of modernity has lost much of its usefulness, fewer are eager to embrace Shea's proposed alternative. "The traditional concept of behavioral modernity does need to be replaced," says archaeologist Curtis Marean of Arizona State University, Tempe, who calls Shea "a major figure" in the field. But "I am unconvinced that John's behavioral variability is the correct replacement."

A wild child

The young man who brought a woodland creature to a Harvard courtyard was himself a child of the forest. Shea grew up in eastern Massachusetts, the oldest of three sons born to working-class parents of Irish and French ancestry, and spent his early years hiking and fishing in nearby woods. "Mom and Dad would have been perfectly happy if I worked in a factory," says Shea, whose stocky body seems ready-made for manual labor. But a local biology teacher, seeing his love of nature, encouraged him to go to college.

In high school, Shea became interested in flint knapping, and while an undergraduate at Boston University, he perfected his technique so that he could fashion a hand ax in a matter of minutes. "It was instant gratification," he says, "and you can do it outdoors."

Shea's proficiency at making stone tools spurred his success as an archaeologist, says his Stony Brook colleague John Fleagle. "He has a deep appreciation of the effort that goes into the creation of different kinds of tools, and he is less bound by abstract typology and closer to the perceptions of the prehistoric people who made them."

Indeed, it was in part his flint-knapping talents, Shea says, that won him a spot as a graduate student at Harvard. Yet Shea feels that he never really fit in there. "Academia requires you to think before you talk, whereas I talk before I think."

Shea's self-critique is shared by many colleagues, although not all view it negatively. "He often comes up with outlandish ideas and just blurts them out," says Harold Dibble, an archaeologist at the University of Pennsylvania. Dibble recalls one conference where graduate student Shea "shocked most of the archaeologists in the room" by arguing that impact fractures on a common type of stone artifact showed that they had been used as projectile points. That idea is now widely accepted. "This is a trait of some very intelligent people," Dibble says. "They are not afraid to be wrong and enjoy throwing out ideas and seeing what happens."

The road from modernity

The turning point in Shea's thinking about modernity came around 2002, when he, Fleagle, and others reopened excavations at Omo Kibish in Ethiopia. Back in the 1960s, Richard Leakey's team had found H. sapiens fossils at this site, and the new team redated those fossils to 195,000 years ago, making them the oldest known modern humans.

Many researchers had long perceived an apparent gap between when humans started to look modern in anatomy and when they started acting modern, as shown by their stone tools and other artifacts. But Shea was sitting at the site one day, looking at stone points the team had found, when he had an epiphany. The points "were very well made, nothing primitive about them at all—they were like what I would make to show off," he recalls. "Suddenly it hit me right in the head. These were people just like me. They just had different challenges to face." There was no sense trying to track these humans' progression to modernity, Shea says, because they already were modern.

To test this insight, Shea undertook a broad study of African stone tools. In a 2011 paper in Current Anthropology entitled "Homo sapiens Is as Homo sapiens Was," he analyzed stone tools from 10 sites in Africa associated with either H. sapiens or its immediate ancestors, dated from 284,000 to less than 7000 years ago. If the modernity concept were correct, Shea argued, there should be significant behavioral differences over time, with younger sites having more types of stone tools—showing specific and flexible adaptations to the environment—as well as more sophisticated tools overall.

Instead, Shea found that, with just one exception, the oldest modern humans in Africa used just as wide a variety of stone tools as did later humans of the early Upper Paleolithic period—long considered the time when modern behavior began to flourish—whose tools fell into four widely accepted types.

Shea concluded that early H. sapiens were as cognitively advanced as humans today. Differences in the most ancient artifacts did not reflect a different level of cognition in their makers, but simply the need to create objects suited to different environmental and social conditions. Indeed, fully modern people haven't always used "sophisticated" tools. For example, 40,000 years ago the first Australians used relatively simple tools compared with the spectacular artifacts of the European Upper Paleolithic (EUP) of the same period.

Shea noted that traditional definitions of "modernity" were biased by lists of artifacts characteristic of the EUP, such as stone blades, tools made of carved bone, personal ornaments, and cave paintings. He decided that the concept of "modernity" could no longer be used as a guide to understanding modern humans, who first emerged in Africa more than 150,000 years earlier.

Tooling up.

Shea bases his arguments on stone tools, such as the projectile points shown in these drawings.

CREDIT: JOHN SHEA

Indeed, a decade before Shea's article, an influential paper had warned against defining "modernity" with EUP behaviors. In 2000, Sally McBrearty of the University of Connecticut, Storrs, and Alison Brooks of George Washington University in Washington, D.C., argued that modern human behavior had deep roots in Africa long before H. sapiens colonized Europe (Science, 15 February 2002, p. 1219). Other researchers have noted that symbolic and artistic behaviors flicker in and out of the record tens of thousands of years before becoming permanently established, perhaps because of demographic rather than cognitive factors (Science, 9 April 2010, p. 164).

But Shea's critique "is probably the most comprehensive to date," says archaeologist James O'Connell of the University of Utah in Salt Lake City. Shea argues that no one set of behaviors—such as art—proves advanced cognition. Thus, he remains unimpressed by discoveries that some others see as milestones of "modern" behavior, such as 77,000-year-old pieces of ochre with etched patterns from Blombos Cave in South Africa, sometimes heralded as the earliest art (Science, 11 January 2002, p. 247). Shea thinks the Blombos people created those etchings because that was their style, not because they suddenly had become able to think abstractly for the first time. "What, were the Blombos people retarded?" Shea asks rhetorically. "That's a pretty pessimistic view of early humans, that scratching a rock with a tic-tac-toe pattern is some kind of threshold of cognition."

Shea seems confident that his assault on behavioral modernity will end up killing off what he sees as a fatally flawed concept. "Sally and Alison led the way," he says of McBrearty and Brooks. "They wounded the beast; I'm cutting off its head and putting it on a stick."

But Brooks says that the concept still has advantages: "It implies an evolutionary trajectory, which variability does not." Thus, she sees key differences between the etched ochre at Blombos and the ochre from the South African site of Pinnacle Point, 100,000 years earlier, which bears no etchings.

Other researchers are giving Shea's new ideas qualified praise. "Unlike modernity, the concept of behavioral variability is quantifiable and amenable to statistical analysis," says archaeologist Metin Eren of the University of Kent in the United Kingdom. All the same, O'Connell says, the research community "will need to hear more before buying in." Dibble agrees: "The problem is how to test" Shea's idea.

Some testing has already begun. In a paper published last month in the Journal of Archaeological Science, Eren's team analyzed variability in the use of a specialized technique for producing stone flakes—placing a stone core on an anvil and striking it with a hammer rather than holding the core freehand—at the Mumba Rockshelter in Tanzania. The team found that the use of this technique came and went over time, and that it correlated more with changing climate and demographic factors than with any cognitive "revolutions."

Shea and Fleagle are planning more tests: They have just received grants to return to Omo Kibish to compare behavioral variability between older and younger archaeological layers, and hope to start next year. "We're focusing on the actual properties of the archaeological record," Shea says, rather than searching for elusive signs of modernity, "which is just a metaphorical construct."
