News this Week

Science  15 Feb 2002:
Vol. 295, Issue 5558, pp. 1206
  1. DATA SHARING

    DNA Sequencer Protests Being Scooped With His Own Data

    1. Eliot Marshall

    Following what he calls an “egregious” violation of scientific etiquette, a researcher has shut down a public Web site containing his team's raw sequence data for Giardia lamblia, a diarrhea-causing protozoan. Mitchell Sogin of the Marine Biological Laboratory (MBL) in Woods Hole, Massachusetts, says he blocked public access 2 weeks ago to this federally supported DNA Web site after discovering that a colleague had published a paper using MBL's sequence information. He intends to restore the site (www.mbl.edu/Giardia) after the rules have been clarified and the data edited.

    The dispute is the latest in a string of clashes between those who collect and those who interpret data, and it brings into focus some questions that have been festering in the genome community. Among them: How much control should DNA sequencers wield over the data they gather? And should they be forced to share preliminary results—as many are now required to do—before they publish their own analysis?

    In 1997, MBL and a group of collaborators* won a 5-year, $2.6 million award to sequence Giardia. The main sponsor, the National Institute of Allergy and Infectious Diseases (NIAID) in Bethesda, Maryland, requires that data be made public as the work progresses. MBL complied by routinely releasing unassembled, raw DNA sequence on its Web site. But MBL intended that its own team should be first to publish an analysis of Giardia's genome. It also asked that others use MBL's preliminary results only to develop reagents or explore mutually agreed-upon projects. These restrictions, Sogin says, “were posted prominently on our Web site.”

    “I was enraged,” Sogin says, when he learned of a paper by Hyman Hartman and Alexei Fedorov in the online early edition of the 5 February Proceedings of the National Academy of Sciences (PNAS). He claims that the authors used MBL's genome data to publish conclusions about Giardia's evolution. That is precisely what Sogin and his collaborative group had intended to do under their NIAID grant. “This was the intellectual rationale for the Giardia grant,” Sogin says.

    Hartman, an affiliate of the biology department at the Massachusetts Institute of Technology, has been researching the evolution of complex organisms (eukaryotes) since 1984. He denies that he violated MBL's conditions and says he was taken completely by surprise by the “huge controversy.” He adds that he has known Sogin “for 20 years,” although he thinks Sogin “didn't take my work seriously”—at least, not until last month.

    Burned.

    Mitchell Sogin stopped releasing Giardia lamblia genome data after another researcher used the information for a paper.

    CREDITS: (TOP TO BOTTOM) PAULINE LIM/MBL; M. ABBEY/PHOTO RESEARCHERS

    In their PNAS paper, Hartman and Fedorov, a Harvard University biologist, argue that eukaryotes inherited some of their most interesting cell structures from a predecessor organism no longer found in nature. They call it a chronocyte and argue that it was distinct from bacteria and archaea. To support this idea, they examine lists of proteins encoded by genomes of yeast, microbes, and other creatures represented in public databases. Among others, they cite Sogin's Giardia data set—which, they note in their paper, they “downloaded” with MBL's help in February 2001. Hartman insists that he used the Giardia data only as “a filter” to subtract out proteins of primitive organisms. (Giardia branched off from other eukaryotes very early.) “I never analyzed [Sogin's] data,” he insists.
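
    Hartman's description of his method amounts to a set subtraction: any protein with a detectable counterpart in the early-branching Giardia lineage is discarded rather than analyzed, leaving candidates unique to the chronocyte's putative descendants. A minimal sketch of such a filter in Python; the has_homolog_in helper is a hypothetical stand-in for a sequence-similarity search (for example, a BLAST query against the downloaded reads), since the paper does not spell out the pipeline:

        def chronocyte_candidates(proteins, giardia_data, has_homolog_in):
            """Use an early-branching genome purely as a subtractive filter:
            keep only proteins with no detectable match in the Giardia data."""
            # has_homolog_in: hypothetical similarity-search callback
            return [p for p in proteins if not has_homolog_in(p, giardia_data)]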

    Sogin nonetheless “went ballistic,” says Hartman, appealing to the editor of PNAS, Nicholas Cozzarelli, for a correction. But Cozzarelli declined to intervene. Sogin “wants to encourage people to use his data but not publish it without his permission,” says Cozzarelli, who thinks that Sogin is claiming too much control and that his position is “untenable.”

    Sogin is not alone in arguing that sequencers should have greater control over the use of unpublished raw data from their own labs. “It's a big concern,” says Claire Fraser, president of The Institute for Genomic Research in Rockville, Maryland, which is sequencing many microbial genomes. “We feel that a full download of a database” for publication without “intellectual input” from the DNA sequencers is “not appropriate.” Richard Hyman, a researcher at Stanford University's genome technology center, says he has contemplated suing people who use his lab's data in publications without permission. However, he's not convinced that Hartman's paper violated the rules.

    The researchers who benefit most from public access to genome data are the computer wizards of bioinformatics. They are sometimes seen as “parasites” because they rely on others for raw material, says Sean Eddy of Washington University in St. Louis, Missouri, a leader in this field. Eddy has championed free access to genome data collections in the past. But he says he has become more aware of the need to protect the publication rights of DNA sequencers. He even suggests that it may be time to “revisit the rules” that demand prompt public release of raw data, laid down during the heyday of the Human Genome Project.

    Such a reassessment may be about to happen. A National Academy of Sciences panel will soon begin an examination of the “responsibilities of authorship in the biological sciences.” Chaired by biologist Thomas Cech, president of the Howard Hughes Medical Institute, the group will hold its first public meeting on 25 February, with a goal of trying to figure out whether there should be “a single set of accepted standards” for data sharing.

    • *The Giardia lamblia genome project includes teams at MBL; the University of Texas, El Paso; the University of Arizona in Tucson; and the University of Illinois, Urbana-Champaign.

  2. INFECTIOUS DISEASES

    Researchers Crack Malaria Genome

    1. Martin Enserink,
    2. Elizabeth Pennisi

    LAS VEGAS, NEVADA—Genome scientists call it the toughest microbe they ever took on. But at a meeting* last Monday, an international group presented the long-awaited first glimpse of the full genome of Plasmodium falciparum, the most important malaria parasite. A project to sequence the genome, begun 6 years ago, is now almost finished, and the results could be published this summer.

    The announcement marks the beginning of what is shaping up as a milestone year for malaria research. The same groups have now sunk their teeth into a handful of other Plasmodium species, and most could be done before the year is over. To top it off, another international consortium plans to publish the genome sequence of the most important malaria mosquito, Anopheles gambiae, also this year. And a new class of inexpensive drugs is showing promise in animal trials (see next story and p. 1311).

    Malaria researchers say the flood of new data is a welcome boost for their field. Most of the Plasmodium sequence has already been put in public databases and has stirred excitement among scientists, says Dyann Wirth, director of the Harvard Malaria Initiative: “It has really energized the field and brought it together.”

    Tragic toll.

    Malaria kills 1 million children a year, mostly in Africa.

    CREDIT: CHARLES HANLEY/AP

    The P. falciparum genome was cracked in a joint effort by the Sanger Centre in the U.K., The Institute for Genomic Research (TIGR) in Rockville, Maryland, the U.S. Naval Medical Research Center (NMRC) in Silver Spring, Maryland, and Stanford University. When they started in 1996, the teams thought the organism's 30-million-base-pair genome was too hefty to tackle with a now-popular technique called whole-genome shotgun sequencing. Instead, they divided up the organism's 14 chromosomes and attacked each of those using the shotgun method.

    The project was a “real nightmare,” says TIGR's Malcolm Gardner, because adenine and thymine, two of the four building blocks of DNA, together make up 80% of the organism's genome. (The proportion is about 59% in humans.) That made it hard to clone Plasmodium's genetic material in bacteria and to sequence it, says Gardner.
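
    The skew Gardner describes is easy to quantify. A minimal sketch in Python of the AT-content calculation, run here on a made-up fragment for illustration (P. falciparum sequence runs near 0.80 by this measure, human DNA near 0.59):

        def at_content(seq):
            """Fraction of a DNA sequence made up of adenine (A) or thymine (T)."""
            seq = seq.upper()
            return (seq.count("A") + seq.count("T")) / len(seq)

        print(at_content("ATTATAATTTACGATTACAT"))  # toy 20-base fragment -> 0.85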

    After developing new techniques and software to overcome those problems, the teams now have the sequence almost complete, with more than 10-fold coverage and just over 500 gaps left, most of them in three particularly difficult chromosomes. While some members are trying to close them, others are “working furiously” to annotate the more than 5600 candidate genes found by gene-finding software, Gardner says.
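
    Those figures can be sanity-checked against the classic Lander-Waterman model of shotgun sequencing, in which the chance that a given base is never sampled after c-fold random coverage is e^-c. A sketch, assuming an illustrative read length of 600 bases (the article gives no read length); tellingly, the idealized model predicts only a couple dozen gaps, far fewer than the 500-odd that remain, consistent with Gardner's point that AT-rich stretches resist cloning and sequencing:

        import math

        G = 30_000_000  # genome size in base pairs
        c = 10          # fold coverage
        L = 600         # assumed read length (illustrative; not given in the article)

        N = c * G / L                 # reads implied by 10-fold coverage: 500,000
        uncovered = G * math.exp(-c)  # expected bases never sampled: ~1400
        gaps = N * math.exp(-c)       # expected gaps between contigs: ~23

        print(round(uncovered), round(gaps))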

    Using the same techniques, the TIGR team has also sequenced the genome of Plasmodium yoelii, which causes malaria in rodents, with fivefold coverage; the group is also working on Plasmodium vivax, which causes a less severe form of human disease. The Sanger Centre, meanwhile, has taken on several other Plasmodium species. Those additional genomes will be particularly useful for researchers using animal models and should help identify what exactly makes P. falciparum such a killer to humans.

    Already, the raw data posted so far have been “tremendously helpful” for scientists searching for new drugs, says Stewart Shuman of the Sloan-Kettering Institute in New York City, one of many researchers who have dredged the data and come up with new drug targets. One group of German researchers not only found an enzyme that could make a good target, but also discovered that an existing antibiotic could block it, and clinical trials are being planned (Science, 3 September 1999, p. 1573).

    But new drugs, although desperately needed, won't win the war against malaria, cautions Stephen Hoffman, a veteran malaria researcher at Celera Genomics in Rockville, Maryland, and former head of the Malaria Department at NMRC, who's involved in both the Plasmodium and Anopheles sequencing projects. Existing drugs can cure malaria, Hoffman points out—but they are too expensive or difficult to get to the people who need them most, and drug resistance is an increasing problem. A vaccine holds the surest hope of preventing the more than 1 million malaria deaths each year, mostly of African children. That's why Hoffman urged his fellow researchers to find such a vaccine, using a concerted, systematic approach—much like the genome project itself.

    • *Second Conference on Microbial Genomes, Las Vegas, Nevada, 10-13 February.

  3. INFECTIOUS DISEASES

    Candidate Drug Breaks Down Malaria's Walls

    1. Gary Taubes

    The global picture of malaria is grim: Each year 300 million to 500 million people are infected and more than 1 million die, mostly children under the age of 5. Malaria parasites have become increasingly resistant to the most commonly used and least expensive antimalarial drugs. What's needed, experts say, are new potent and inexpensive drugs, ideally aimed at entirely new therapeutic targets that will make it that much harder for the parasites to acquire multidrug resistance.

    On page 1311, a team led by biochemist Henri Vial of the University of Montpellier II and the Centre National de la Recherche Scientifique in France reports on just such a new class of antimalarial drugs, one that has shown remarkable effectiveness in rodents and primates. Unlike most commonly used antimalarials, which target the parasites' hemoglobin metabolism or DNA synthesis, the new compound inhibits the parasites' ability to synthesize protective membranes while sequestered away within red blood cells. The compound “seems to be extremely potent and active against even multidrug-resistant strains of Plasmodium falciparum,” says David Fidock, a molecular parasitologist at the Albert Einstein College of Medicine in New York City.

    The compound, dubbed G25, aims at the third stage of the malaria life cycle in humans. The parasites first enter the bloodstream, dribbled in with the saliva of mosquitoes, as sporozoites, and then they quickly burrow into liver cells. There they multiply by the tens of thousands and emerge a week later as merozoites, which infiltrate red blood cells. These merozoites are the target of G25. Once inside the red blood cell, also known as an erythrocyte, a single merozoite produces some 20 progeny. These erupt from the cell, reinvade the bloodstream, and colonize yet more erythrocytes. This stage of the parasite's growth cycle is responsible for virtually all clinical symptoms of malaria, because the parasites can eventually colonize and destroy up to 70% of all red blood cells, causing severe anemia, fever, convulsions, coma, and death.

    Bloodsuckers.

    A new compound kills malaria parasites (yellow) within red blood cells.

    CREDIT: OLIVER MECKES/NICOLE OTTAWA/PHOTO RESEARCHERS

    To Vial and his colleagues, the parasites looked vulnerable because of their need to package each of their erythrocyte-born progeny in protective lipid membranes. Uninfected erythrocytes, in contrast, engage in no lipid synthesis of their own. By targeting this synthesis, Vial says, “in theory we would be attacking metabolism that is not present in the host cell and so would not affect the host cell. But if you prevent the parasite itself from synthesizing lipids, it will not survive.”

    Over the course of 20 years of research, Vial and his colleagues dissected the pathway by which the parasite takes choline from blood plasma and converts it into the major component of its protective membranes. They then demonstrated that blocking synthesis of phospholipids stops parasite replication. Finally, they designed the newly reported compound, G25, to block the receptor for choline transport, which can be found both on the surface of the infected erythrocytes and on the membrane of the parasite sequestered within.

    The results were dramatic. In rodents and primates infected with P. falciparum, the most lethal form of malaria, G25 effected quick and total cures at low doses. “A few days after the first injection, all the parasites in the monkey were dead,” says team member Clemens Kocken of the Biomedical Primate Research Center in Rijswijk, the Netherlands.

    G25 is both easy to make and inexpensive—essential qualities for a drug that will be used in sub-Saharan Africa, where 90% of all malaria cases arise, and Southeast Asia, both of which are plagued by multidrug-resistant malaria. One drawback is that the drug must be injected. And, as Michael Gottlieb, chief of parasitology at the National Institute of Allergy and Infectious Diseases in Bethesda, Maryland, points out, the researchers “obviously need a lot more toxicity data before it becomes obvious that this compound will be therapeutically effective.” Vial says his team hopes to have a more convenient oral candidate for preclinical studies within 2 years.

  4. PALEONTOLOGY

    Earliest Animal Tracks or Just Mud Cracks?

    1. Richard A. Kerr

    When they were first discovered, the wiggly grooves on slabs of ancient sandstone from central India were dramatic enough: They appeared to some to be 1.1-billion-year-old worm tracks. That date would push the earliest known record of complex animals back a startling half-billion years (Science, 2 October 1998, p. 19). But, it turns out, the first publication on the find was greatly understated.

    Two groups report in the February issue of Geology that the rock marked by the putative tracks is a whopping 1.6 billion years old. That predates the earliest generally accepted trace fossil of a complex animal—dated at 575 million years ago—by about a billion years. To some researchers, such a long gap strains credulity. Instead of traces of life, they are now seeing meaningless doodlings in ancient, squishy muds.

    The new, solid age for the Indian grooves comes from radiometric dating by two independent groups. Both exploited the clocklike radioactive decay of uranium to lead in tiny crystals of zircon deposited with volcanic ash just before and just after the grooves formed. The two groups—one led by paleontologist Birger Rasmussen of the University of Western Australia in Crawley, the other by geochemist Jyotiranjan Ray, now at the University of Hawaii, Manoa—got ages of just over 1.600 billion years, give or take less than 0.008 billion years.
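
    The arithmetic behind such a date is compact: a zircon crystal locks in uranium but essentially no lead when it forms, so the ratio of accumulated lead-206 to remaining uranium-238 fixes the age as t = ln(1 + Pb/U)/λ. A sketch under that textbook simplification (real U-Pb geochronology cross-checks two decay chains and corrects for any initial lead):

        import math

        LAMBDA_238 = 1.55125e-10  # decay constant of uranium-238, per year

        def u_pb_age(pb206_u238_ratio):
            """Age in years from a zircon's measured 206Pb/238U ratio,
            assuming no initial lead and a closed system."""
            return math.log(1.0 + pb206_u238_ratio) / LAMBDA_238

        print(u_pb_age(0.282) / 1e9)  # ratio near 0.28 -> roughly 1.6 billion years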

    “There's no question” that the groovy rock is very ancient, says geochronologist Samuel Bowring of the Massachusetts Institute of Technology, a co-author of the Ray paper. Dating by various other techniques that pointed to an age of 1.1 billion years or younger (Science, 16 April 1999, p. 412) must have been affected by alteration of the rock, Bowring says.

    Wormy or just groovy?

    These putative worm tracks are now dated at 1.6 billion years.

    CREDIT: A. SEILACHER ET AL., SCIENCE 282, 80 (1998)

    With doubts about the appropriate age resolved, the biological origins of the grooves become “even more exciting or more improbable,” says paleontologist Adolph Seilacher of Yale University, who with colleagues proposed that the grooves were formed by evolutionarily advanced worms burrowing just beneath the sea floor. “This age makes it unlikely these are animal trace fossils,” Seilacher says. “At the same time, I have to go with the evidence. I have not found or heard of any other explanation. Do we have any nonbiological interpretation of these things?”

    As it happens, the answer is yes. “No one is better in the field than Seilacher,” says paleontologist Mary Droser of the University of California, Riverside, but, on closer inspection, she finds that the grooves “look much more like cracks than trace fossils.” The details of groove diameter, the V shape of groove floors, and the irregular pattern of grooves all point to cracked mud rather than burrowing, Droser says. In addition, “you wouldn't expect a billion years without [similar traces].”

    The debate over the earliest traces of animal life “is a great dress rehearsal for when we get samples from Mars,” says Bowring. “How do you decide when something is biogenic? Paleontologists haven't completely come to grips with that.” Perhaps squiggly grooves from India can help prepare us for that encounter.

  5. BIOMEDICAL ETHICS

    Study of Brain Dead Sparks Debate

    1. Jennifer Couzin

    Renata Pasqualini and her husband Wadih Arap, biologists at the M. D. Anderson Cancer Center in Houston, Texas, had for several years been working on a new approach to designing targeted cancer drugs, but they were not sure how to test it on people. At the same time, they were deeply moved by the families of cancer patients they encountered. Watching loved ones decline, their brains silent, their bodies tethered to life support, the families sometimes offered to donate their relative's organs, but advanced cancer made that impossible. From this juxtaposition arose a novel experiment: Pasqualini, Arap, and their colleagues have infused millions of peptides into brain-dead and near-death patients to determine which ones end up in specific tissues.

    Despite the initial “yuck factor,” as Anne Flamm, a clinical ethicist at M. D. Anderson who helped design the protocol, describes it, she and others believe that with stringent informed consent procedures, such studies are ethically sound. And the first of the experiments, on a 48-year-old brain-dead man, reported in the February issue of Nature Medicine, has yielded a wealth of data.

    “Being able to get information from a human being, in vivo—not just taking cells out—has wide-ranging implications,” says Donald McDonald, a vascular biologist at the University of California, San Francisco (UCSF). “Everyone recognizes that this was a risk that [the researchers] took because of the [study's] obvious sensitivity.”

    Pasqualini, Arap, and their colleagues believe that tracking which peptides—short strings of amino acids—are drawn to blood vessels in certain tissues could pave the way toward drugs that use those peptides to home in on the blood vessels feeding particular tumors. In the late 1990s, they helped establish that in mice, different peptides bind to blood vessels in different parts of the body, and that vessels feeding tumors differ from healthy ones. From tissue biopsies taken after infusing the peptides, the team determined which classes of peptides were present in each. But they worried that the same types of peptides would not migrate to the same blood vessels in humans.

    Picky.

    Certain peptides latch onto prostate blood vessels (top) but not skin (bottom) in tissue collected from a brain-dead man.

    CREDIT: W. ARAP ET AL., NATURE MED. 8, 126 (2002)

    Finding out posed ethical challenges: The multiple biopsies needed—of skin, muscle, bone marrow, prostate, fat, and liver—would be too invasive to gather from conscious individuals. So in late 1999, Arap and Pasqualini approached M. D. Anderson ethicists about the idea of experimenting on brain-dead and near-death patients.

    Flamm and fellow ethicist Rebecca Pentz scanned medical literature for precedents but unearthed few. In 1981, researchers received permission to test an artificial breathing device on brain-dead children; 6 years later a brain-dead man was infused with monoclonal antibodies.

    The pair recommended strict rules. First, the impetus to participate must come from families: Only after a family inquires about organ donation or research can it be told of the study. The procedure must last no more than an hour, and families of near-death patients must be warned that death could occur during the experiment. In early 2000, the hospital's Institutional Review Board (IRB) gave the green light for the team to infuse roughly 200 million different peptides into their subjects.

    Still, the studies have prompted ethical questions few have considered before. Elizabeth Hohmann, an infectious-disease specialist and chair of the IRB for Massachusetts General Hospital and Brigham and Women's Hospital, both in Boston, says she has never encountered proposals to experiment on brain-dead people on life support. Nor has John Falletta, a pediatric oncologist and lead chair of Duke University's IRB. If the body is respected, he says, “such research could be very important.”

    A smattering of hospitals seem to agree. Pasqualini's group has since infused peptides into two more individuals as part of the same study. The University of Pittsburgh in Pennsylvania recently approved two studies on brain-dead subjects on life support; one tests a device to treat heart and lung failure. And M. D. Anderson approved another study last May, in which patients declared dead are connected to a mechanical resuscitation device intended for those in cardiac arrest; researchers then determine whether it induces blood flow. “It can't inflict pain,” explains Lee Parmley, interim chair of critical care and the leader of the study.

    The second and third subjects in the Pasqualini team's study are not brain dead but “nearly dead”—unconscious patients on ventilators with failing organs but continued brain activity. These cases prompted additional scrutiny to ensure respect for the patients' wishes.

    Although the team has published results on just one subject, scientists such as McDonald are impressed. The group homed in on certain sets of peptides that share similar amino acids, including one that appears specific to prostate blood vessels. But uncertainties remain. Due to their grave condition, these subjects may not be broadly representative, says UCSF ethicist Bernard Lo. In addition, with so many peptides infused at once, they could interact with one another and skew the results. Arap says that double-checking against other tissue samples to confirm results suggests that thus far, these problems haven't surfaced.

    Meanwhile, the biomedical community is notably silent, says Michael DeVita, a University of Pittsburgh physician. DeVita and three colleagues are planning a presentation at a conference this fall, where they will explore how the dead, on and off life support, may appropriately be used in research—and how they may not.

  6. NUCLEAR HISTORY

    Letters Aver Physicist Supported Nazi Bomb

    1. Adrian Cho*
    1. Adrian Cho is a freelance writer in Boone, North Carolina.

    For more than half a century, historians have speculated about a private conversation that took place in September 1941 between German physicist Werner Heisenberg and Danish physicist Niels Bohr. Long-secret letters released on 6 February by the Niels Bohr Archive in Copenhagen finally provide an answer. They flatly contradict claims made by Heisenberg after the war that he told Bohr he intended to subvert the Nazi bomb program from within.

    Eighteen months after German troops occupied Denmark, while the Nazi war machine was still crushing all in its path, Heisenberg traveled to Copenhagen to see his former mentor, Bohr. The two Nobel laureates talked in private, and Heisenberg said something about nuclear fission that so disturbed Bohr that the Dane abruptly ended both the exchange and their long friendship.

    Heisenberg later implied he had tried to signal that he knew it was possible to make an atomic bomb, but that he would subtly sabotage the German drive to do so. Bohr misunderstood his intentionally oblique language, Heisenberg said in a letter published in 1957 in Robert Jungk's history of atomic weapons, Brighter Than a Thousand Suns. Bohr disagreed with this account and drafted a letter to Heisenberg to set the record straight. He never posted the letter, however, and it surfaced only after Bohr died in 1962, folded into his copy of Jungk's book. The letter was to have remained sealed in the Bohr archive until 2012, but the Bohr family agreed to release it and 10 other secret documents ahead of schedule in response to the intense interest sparked 4 years ago by Copenhagen, the award-winning play by writer Michael Frayn that speculates about what the two men said. The archive published the documents on the Internet (http://www.nba.nbi.dk/).

    Fallout.

    Werner Heisenberg (left) and his mentor Niels Bohr, shown here in 1934, later split over German A-bomb research.

    CREDIT: PHOTO BY PAUL EHRENFEST JR., COURTESY OF AIP EMILIO SEGRÉ VISUAL ARCHIVES

    In the letter found in the book, Bohr writes: “You spoke in a manner that could only give me the firm impression that, under your leadership, everything was being done in Germany to develop atomic weapons and that you said that there was no need to talk about details since you were completely familiar with them and had spent the past two years working more or less exclusively on such preparations.” In another letter, Bohr explicitly repudiates Heisenberg's contention that he implied he would undermine the Nazi bomb program. “It is therefore quite incomprehensible to me,” Bohr writes, “that you should think that you hinted to me that the German physicists would do all they could to prevent such an application of atomic science.”

    Of course, the letters provide only Bohr's recollection of the conversation, says Gerald Brown, a physicist at the State University of New York, Stony Brook, who knew both men. “I don't think Bohr understood what Heisenberg was trying to say,” Brown says. Heisenberg, who died in 1976, had no reason to endanger himself by revealing the Nazi nuclear research program unless he was trying to deliver a deeper moral message, says Thomas Powers, author of Heisenberg's War: The Secret History of the German Bomb. “He thought that he got the word out in some form or another,” Powers says. “But Bohr makes it clear he didn't hear a thing.”

    However, Hans Bethe, a physicist and Nobel laureate at Cornell University who worked on the Manhattan Project, says he no longer believes that Heisenberg tried to make only a nuclear reactor. “The letter changed my view,” Bethe says. “It seems that in 1941 Heisenberg wanted to build a bomb.” After the war, Heisenberg had more reason than Bohr to “misremember” the facts when recounting the meeting, says Gerald Holton, a physicist and historian of science at Harvard University. “Niels Bohr had no reason to say something that wasn't true,” Holton says, “whereas Heisenberg had a real problem after the war, namely, explaining why the German group failed to do what they set out to do.”

    If Heisenberg was working in earnest on the German bomb effort, then his purpose in visiting Copenhagen was likely more personal than political, Bethe says. The Nazis threatened Bohr, whose mother was Jewish, and Heisenberg must have known that his visit would help secure Bohr's safety. “He was convinced that Germany would win the war,” Bethe says, “and he wanted Bohr and his institute to survive.”

  7. UNITED KINGDOM

    Parliament Takes Aim at Royal Society

    1. Adam Bostanci*
    1. With additional reporting by Anna Baynham.

    CAMBRIDGE, U.K.—A showdown is looming between Britain's oldest and most respected scientific institution and the U.K.'s House of Commons. Responding to long-standing concerns over elitism and discrimination against women at the Royal Society, the Commons' Select Committee for Science and Technology has launched a probe of how the society and similar institutions should use public money and how they elect members.

    The Royal Society, founded in 1660, received $37 million from the government last year, most of which it spent on postdoctoral research fellowships and travel grants. It also organizes meetings, publishes journals, and acts as an independent “voice of science” for the government. Each year, the society bestows lifelong membership on 42 new “fellows.” But despite a policy of equal opportunity, only 44 of its present 1216 fellows are women. Moreover, 62% of the fellows are based in London, Oxford, or Cambridge, home to the country's top universities.

    Select Committee chair Ian Gibson, former dean of biology at the University of East Anglia in Norwich, says he wants to find out why the society's fellows do not reflect the makeup of the wider scientific community. He also wants to ensure that there isn't duplication of effort among the Royal Society, the Royal Academy of Engineering, and other learned societies in areas such as the popularization of science. “That outcome includes the possibility of more money for learned societies,” he says. His goal is to achieve “a complete revamp and modernization” of the Royal Society.

    Search me.

    Robert May welcomes the Commons' inquiry.

    CREDIT: AP PHOTO/THE ASAHI GLASS FOUNDATION

    Robert May, president of the Royal Society and former government chief scientist, told Science he acknowledges that the society is “working against the pyramid” of gender inequality and is actively trying to identify women scientists who may have been overlooked. It has also recently changed its nomination rules: Starting this year, a candidate needs to be nominated by only two fellows instead of six, which may make it easier for women to be nominated. “We also try to have women on all our committees, but that turns out to be a burden for [the female fellows], because there are so few,” says May. However, he says, “we will not have different standards of election [for men and women].”

    Early reaction from scientists supports that view. Plant scientist Lorna Casselton, a Royal Society fellow at Oxford University, agrees that doctoring the selection process to favor women would be unacceptable: “I don't think women would like to see double standards applied.” All the female fellows contacted by Science stressed that they had never experienced or seen any discrimination in the selection of candidates. “The problem is with society, not with the Society,” says physiologist Frances Ashcroft, a fellow at Oxford University. Fewer women follow careers in science, and the proportion of women in the Royal Society is the same as the proportion holding scientific chairs in British universities, she says.

    The Select Committee intends to call the Royal Society and other societies to give evidence after March. It will be an “interesting battle,” says Gibson. But he may have little power to influence the inner workings of the Royal Society. “Once the committee has discovered how we elect fellows, we will welcome its ideas,” counters May.

  8. PALEONTOLOGY

    Tug-of-War Over Mystery Fossil

    1. Sabine Steghaus-Kovac*
    1. Sabine Steghaus-Kovac is a writer in Frankfurt.

    FRANKFURT, GERMANY—Another blockbuster dinosaur find from China has sparked a disagreement between leading paleontologists in Germany and China. Last week, Friedrich Steininger, director of Frankfurt's Senckenberg Natural History Museum, tried to clear the air over his museum's purchase of a mysterious fossil amid claims that it was smuggled out of China illegally. But Chinese paleontologists insist that the specimen must be handed back. “It is more than clear that Chinese law forbids such exports of important vertebrate fossils,” says paleontologist Zhou Zhonghe of the Institute of Vertebrate Paleontology and Paleoanthropology (IVPP) in Beijing.

    One thing not in dispute is that many scientists are clamoring to see the find. The almost complete psittacosaur—a bipedal plant eater that's the size of a large dog and has a parrotlike beak—has a tuft of filaments on its tail that resemble a porcupine's quills. This is the first time such adornments have been found outside the theropods, the group that includes large bipedal carnivores such as Tyrannosaurus rex. “The discovery of these structures will certainly change the way we look upon the [skin] of dinosaurs,” says Gerald Mayr, a paleoornithologist at the Senckenberg.

    The fossil took a circuitous route to the Senckenberg. It first surfaced in 1997 at the Tucson rock show, a major marketplace for fossils and minerals. The following year the fossil was sold by a U.S.-based fossil dealer to a pair of European dealers, who arranged to have it exported legally under U.S. law. At the time, the psittacosaur bones were still mostly embedded in the limestone characteristic of China's 120-million-year-old Yixian formation.

    A tall tale.

    The disputed psittacosaur and its tail filaments.

    The fossil's European owners took it to the Natural History Museum in Milan, Italy, where researchers, while partially preparing the fossil, discovered parts of the skin and the quill-like filaments. Realizing that the specimen was an important find, one of the dealers, geologist Flavio Bacchia, invited an international group of experts, including IVPP's Dong Zhiming, to examine the specimen in 2000. Dong and others urged that the specimen be returned to China. That summer, a representative of the dealers went to Beijing to negotiate the psittacosaur's return in exchange for casts of feathered dinosaurs, but the deal foundered. “Bacchia was sincere in his desire to return the fossil to China, but he and his German partner had invested a lot in the specimen,” says Eric Buffetaut of France's CNRS research agency in Paris, a member of the expert group.

    After this setback, the dealers sold the fossil to the Senckenberg for about $70,000 in June 2001. Steininger says that the museum bought the fossil to save it for research: “We realized the scientific value of the specimen and decided that it must not disappear into a private collection.”

    Last month, German newspapers reported that the Chinese Society of Vertebrate Paleontology had gone on record as favoring repatriation of the fossil. But last week Steininger said that he has not responded to a letter from the society because it was not printed on letterhead and lacked a signature. “To date we have received no official inquiry from the IVPP to return the fossil to China,” he says. Steininger says he wants to work out a deal with Chinese officials whereby the specimen would be the property of China but would be exhibited at the Senckenberg. But IVPP director Zhu Min apparently has no interest in such a deal: “The prerequisite is that the fossil must return to China.” Once that happens, he says, “we can negotiate on whatever they want to discuss.”

    With reporting by Erik Stokstad.

  9. HOMESTAKE MINE

    Neutrino Lab Detects Heavy Political Fallout

    1. David Malakoff*
    1. With reporting by Jeffrey Mervis.

    The National Science Foundation (NSF) is still mulling over the merits of a $281 million proposal to convert a South Dakota gold mine into the world's deepest underground laboratory. But the politics are very much on the surface, with implications for NSF, other major research facilities, and, improbably, control of the U.S. Senate.

    The scientific plan before NSF officials would transform part of the 125-year-old Homestake Gold Mine into a 2300-meter-deep facility for studying elementary particles called neutrinos, as well as for experiments in the geological and life sciences (Science, 15 June 2001, p. 1979). Pressed by a company-imposed deadline to close and flood the mine, backers last year decided not to wait for NSF's approval and took their case to Congress. Legislators obliged by appropriating $10 million to keep water out of the mine. Last December, lawmakers also wrote language to protect the mine's owner, Barrick Gold Corp. of Toronto, Canada, from future environmental lawsuits.

    But the deal, crafted by Senate Majority Leader Tom Daschle and Democratic colleague Tim Johnson, both of South Dakota, and the state's Republican governor, Bill Janklow, was modified at the last minute to satisfy House Republican leaders. Conservative talk radio host Rush Limbaugh and others had complained that the Senate language would have saddled U.S. taxpayers with millions of dollars in cleanup costs. But Barrick officials have threatened to pull out of the deal due to the House revisions that were adopted.

    The mine has also become an issue in the race between Johnson and Representative John Thune (R-SD), a contest that could decide which party controls the Senate. Johnson has said that House Republicans “dropped the ball” on the laboratory, implying that Thune would be responsible for losing the economically important project if the company pulls out. But in recent television advertisements, Thune says he worked hard to close the deal. Last week Thune claimed credit for convincing President George W. Bush to put nearly $10 million in his 2003 budget request to keep the project moving while NSF ponders the laboratory's scientific merit.

    In addition to being the cause of this political sparring, Homestake has made itself an uninvited presence in NSF's 2003 budget request. In a section on “early-stage planning for potential large facilities,” the White House instructs the agency to help the National Academy of Sciences design a study reviewing a planned $240 million neutrino detector at the South Pole, called IceCube, in light of “other proposed U.S. neutrino collectors … and planned neutrino research throughout the world.” IceCube is an expansion of an existing neutrino observatory called AMANDA.

    Sen. Johnson
    Rep. Thune
    Sen. Daschle

    Follow my lead. A proposed underground lab in Lead, South Dakota, has become a hot political issue.

    CREDITS: (JOHNSON AND THUNE) DOUG DREYER/AP; (DASCHLE) KEN CEDENO/AP

    NSF has also been told to convene a workshop “on all aspects of underground and/or neutrino research.” That language, explains Joe Dehmer, head of NSF's physics division, “ties together existing work on IceCube and underground labs, even though they do two quite different types of science.” IceCube will search for very high energy particles that originate beyond the solar system, whereas the Homestake lab is expected to probe lower energy neutrinos that will shed light on the sun and neutrinos themselves. The Bush Administration also has requested $2 million in NSF's physics program next year to support research on neutrino detectors. Although Dehmer says that new findings have made the field “a hot topic” for scientists, the top-down allocation troubles physicist John Bahcall of Princeton University in New Jersey. A major backer of both Homestake and IceCube, Bahcall says such priorities shouldn't be dictated by the White House.

    Homestake backers see the study as an endorsement of their claim that the underground lab is a cutting-edge research facility on a par with other projects, such as IceCube, that have already won NSF's backing. But IceCube's chief scientist, Francis Halzen of the University of Wisconsin, Madison, says he “welcomes the review, [because] we don't have anything to fear.”

    For Homestake, however, any studies would be moot if Barrick officials pull the plug—a decision that could come within a month. And even if the company agrees to transfer the land, the project needs the approval of NSF's 24-member governing board. That approval, if granted, would trigger another race, this time pitting the Homestake lab against NSF's growing, and increasingly expensive, list of facilities that it wants to build.

  10. BECOMING HUMAN

    In Search of the First Hominids

    1. Ann Gibbons

    Thanks to an astonishing series of fossil discoveries, researchers are at last glimpsing our earliest ape ancestors, back beyond 4 million years ago. The finds are shifting attention from the savanna to the woods—and changing ideas about what it means to be a hominid

    If portraits of our early ancestors were displayed in a museum of human evolution, the collection would include figures of nearly a dozen early humans who lived in Africa from 1 million to 3 million years ago. Visitors would see robust ape-men from the caves of South Africa; a long-limbed boy from Kenya who stood almost 6 feet tall; and lightly built toolmakers from Olduvai, Tanzania. On a pedestal by herself might be a statue of the matriarch called Lucy, a female the size of a small chimpanzee whose species, Australopithecus afarensis, walked upright through the East African bushland 3 million to about 3.6 million years ago.

    For 2 decades, Lucy stood alone as the first known human ancestor. Her species was thought by many to have given rise to all that came later, including our own lineage. But her own origins were a mystery. No matter where paleontologists searched, they found few fossils of protohumans more ancient than A. afarensis. All the older fossils—a few jaw scraps and a single tooth—could fit into the palm of a hand. “All we had were ill-dated scraps. We had almost no clues about what came before Australopithecus afarensis,” says paleoanthropologist Tim White of the University of California, Berkeley.

    But now researchers are adding a new wing to that gallery of ancestors. In the past few years, paleontologists have unearthed dozens of fossils of new kinds of primates, including at least three that may be the earliest members of the Hominidae, the family that includes humans but no other apes. Just last summer, a skull was discovered in Chad that may date to 6 million years ago, and new details were published on another ape-man who may have been walking upright in Ethiopia at about the same time. This came hot on the heels of the discovery in autumn 2000 of the equally ancient “Millennium Man” in Kenya. The faces and many features of these earliest hominids remain shadowy, but their outlines can be discerned, revealing apes the size of chimpanzees that walked upright through African forests.

    As paleoanthropologists begin to peer back beyond 4 million years ago, they are also filling in the details of the characters that followed, revealing stunning new fossils of hominids that lived 3 million to 4 million years ago, with descriptive nicknames such as Flat-Faced Man and Little Foot. “During this time, we're dealing with a wetter, warmer Africa that it seems was spawning hominids from the shores of Lake Chad to the caves of Sterkfontein [in South Africa],” exults Phillip Tobias of the University of the Witwatersrand in Johannesburg, South Africa. Adds Meave Leakey of the National Museums of Kenya: “If you look at the number of major discoveries, it's staggering.”

    Although scientists are still analyzing the oldest specimens and have yet to publish complete descriptions of these top-secret fossils—prompting one colleague to dub the study of the first hominids the Manhattan Project of paleoanthropology—the data thus far are already challenging old views.

    Toehold.

    Yohannes Haile-Selassie (above) says an Ardipithecus ramidus kadabba foot bone (above; top row, third from right) shows it walked upright.

    CREDITS: (TOP TO BOTTOM) ©1998 DAVID L. BRILL/BRILL ATLANTA; ©1996 DAVID L. BRILL/BRILL ATLANTA

    The first surprise is that more than one type of hominid may have been living between 6 million and 5 million years ago and that these very early hominids show diversity in their teeth and anatomy. That suggests a period of hominid evolution even earlier than most researchers have believed and also prompts questions about how reliably the molecular clock is calibrated (see sidebar on p. 1217). Another surprise is that the oldest hominids were walking upright yet living in woodlands, dealing a lethal blow to the hypothesis that bipedalism emerged when hominids first stood up and stretched their legs on the savanna. “These fossils are causing a paradigm shift,” says paleontologist Martin Pickford of the Collège de France in Paris, co-discoverer of Millennium Man. “A lot of old ideas will be put into the wastebasket.”

    Into the trash, in fact, may go the very definition of what it means to be a hominid, as there is now little agreement on what key traits identify an exclusively human ancestor. Nor is there agreement on which species led to Homo, or even whether the fossils represent different species or variation within a single species. “Preconceptions of a large-toothed, fully bipedal, naked ape standing in the Serengeti 6 million years ago are X-Files paleontology,” says White. “What we're learning is we have to approach this fossil record stripped of our preconceptions of what it is to be a hominid.”

    The First Family

    For 20 years, A. afarensis was without rival as the first known hominid. Lucy was discovered in 1974 at Hadar, Ethiopia, and since then her species has been characterized by 360 fossils from more than 100 individuals who lived from 3 million to about 3.6 million years ago. With the mix of traits expected in a human ancestor, A. afarensis helped define ideas about early hominids. Lucy was the size of a female chimpanzee, with long arms, a small brain, and a strikingly apelike jaw to match. But she also showed more derived, humanlike traits. She and 13 individuals of her species, dubbed the First Family, were bipedal and had thick tooth enamel, large molars, and smaller canines shaped like those of later australopithecines, reflecting a transition from a diet of fruits and leaves to one of hard roots, tubers, insects, and small animals, says paleoanthropologist William Kimbel of the Institute of Human Origins at Arizona State University in Tempe. Her curved fingers revealed grasping hands, whereas apes grasp with both feet and hands.

    Root ape?

    Tim White (top) thinks Ardipithecus ramidus led to Homo.

    CREDITS: (TOP TO BOTTOM) ©1994 TIM D. WHITE/BRILL ATLANTA; ©1999 TIM D. WHITE/BRILL ATLANTA

    But despite the bounty of A. afarensis fossils, researchers were stymied as they sought to discover Lucy's own roots. “Beyond 3.6 million years you were just in a black hole in the fossil record until you got back into the middle Miocene [about 15 million to 9 million years ago],” recalls White. And the “muddle” of ape fossils in the early Miocene, when apes underwent a burst of speciation and came in all sorts of body plans, made it difficult to sort out which anatomical traits were inherited from the common ancestor of chimps and humans—and which ones evolved only in apes or only in humans, notes paleoanthropologist Carol Ward of the University of Missouri, Columbia.

    For years the leading explanation was that the diverse Miocene apes went through a bottleneck, with only a few lucky apes emerging, including the ancestor shared by humans and chimpanzees. That root ape fairly quickly gave rise to A. afarensis, which in turn was the ancestor for everything that came later, including the Homo lineage.

    Then in 1992, the Middle Awash Research Team, co-led by White, made a discovery that ended Lucy's reign. About 75 kilometers south of Lucy's resting place, at Aramis in the Afar depression of Ethiopia, the team found fossils of a chimp-sized ape dated to about 4.4 million years ago. This creature earned its place on the human line “metaphorically and literally by the skin of its teeth,” as paleoanthropologist Bernard Wood of George Washington University (GWU) in Washington, D.C., wrote when the fossil was announced in Nature; many pieces were dental.

    Fossil trail.

    Many kinds of hominids lived in Africa 6 million to 2.5 million years ago, before Homo appeared.

    What White and his colleagues saw in the mouth of this ape was a mosaic of chimplike features they considered primitive, such as the shape of its baby molars, and more derived humanlike features such as a diamond-shaped canine rather than the honed V shape of chimps. The team named this species Ardipithecus ramidus, drawing on two words from the Afar language suggesting that it was humanity's root species. But skeptics argue that the published fossils are so chimplike that they may represent the long-lost ancestor of the chimp, not human, lineage.

    The next field season, team member Yohannes Haile-Selassie found the first of more than 100 fragments that make up about half of a single skeleton of this species, including a pelvis, leg, ankle and foot bones, wrist and hand bones, a lower jaw with teeth—and a skull. But in the past 8 years no details have been published on this skeleton. Why the delay? In part because the bones are so soft and crushed that preparing them requires a Herculean effort, says White. The skull is “squished,” he says, “and the bone is so chalky that when I clean an edge it erodes, so I have to mold every one of the broken pieces to reconstruct it.” The team hopes to publish in a year or so, and White claims that the skeleton is worth the wait, calling it a “phenomenal individual” that will be the “Rosetta stone for understanding bipedalism.”

    And a few clues to Ardipithecus emerged last year, when Haile-Selassie published fossils of an older subspecies from the Middle Awash, called A. ramidus kadabba and dated to 5.2 million to 5.8 million years ago. These fossils have literally a toehold on the hominid branch of the ape family tree: Their classification rests partly on a nearly complete foot bone that the team thinks was used to “toe off” in a manner seen only in upright walkers.

    Millennium Man

    The other chief contender for first hominid was discovered by a joint Kenyan-French team in October 2000, some 1200 kilometers southwest of Aramis in the Lukeino Formation of Kenya's Tugen Hills. In a short paper published last year in the Comptes Rendus de l'Académie des Sciences de Paris, paleontologist Brigitte Senut of the National Museum of Natural History in Paris, Pickford, and their colleagues introduced these 13 fossils as Orrorin tugenensis, from the Tugen words for “original man” (Science, 13 July 2001, p. 187).

    Spying the thigh.

    Brigitte Senut and Martin Pickford (above) discovered O. tugenensis's thighbone; they say it shows signs of upright walking.

    CREDITS: MARC DEVILLE/GAMMA

    Senut and Pickford put the fossils at 5.8 million to 6.1 million years old, although a rival team (Science, 13 April 2001, p. 198) now dates the Lukeino Formation at 5.72 million to 5.88 million years ago; those radiometric dates were published this month in the Journal of Human Evolution (JHE) by paleoanthropologist Andrew Hill of Yale University and colleagues.

    Whatever the precise age, the find is sensational. Senut and Pickford say O. tugenensis “is definitely a hominid”—a bold claim that rests primarily on three thighbones, or femora. Their initial report focused on the top end of the femur, which they said was “more humanlike than those of australopithecines.” In fact, they propose that O. tugenensis walked more like humans than Lucy did, based on six features, including the size and shape of the head and neck of Orrorin's femur.

    The implications are startling: If Senut and Pickford are right, that suggests that Millennium Man is the ancestor of Homo and that Lucy was not the mother of us all (see diagram on p. 1214). Otherwise, it implies what Pickford calls “yo-yo” evolution, where humanlike bipedalism evolved in O. tugenensis, was modified in A. afarensis, then later returned to a human kind of walking.

    Who begat whom?

    Researchers have a new view of hominid diversity through time, but the picture is full of question marks—indicating uncertainty about dates, classification, and lines of descent.

    Thus Senut and Pickford argue that O. tugenensis is ancestral to Homo by way of a proposed genus called Praeanthropus, which includes certain fossils now assigned to A. afarensis and A. anamensis. They also suggest that Ardipithecus gave rise to chimpanzees.

    This view, so far based chiefly on the femur, has been greeted with skepticism. “There is nothing in the announcement that makes that femur bipedal,” says Missouri's Ward. “The onus is on them to prove it.”

    However, those who have seen casts of the fossils say that Senut and Pickford have a leg to stand on: “As a working hypothesis, I think they are correct, although they don't have the most diagnostic set of fragments,” says Ian Tattersall, a paleoanthropologist at the American Museum of Natural History in New York City. To cement their claim, Senut and Pickford have made new computed tomography scans, which reveal that the structure of the bone in the neck of the femur resembles that of hominids rather than apes. They have also revealed a groove on the back of the femoral neck for the attachment of the obturator externus muscle, which they say shows that the bone was remodeled by upright walking.

    But it may take more than legs to make a hominid. Senut and Pickford, fresh from the field in November, say they now have a total of 22 fossils of O. tugenensis from at least six individuals from four sites, including a thumb bone and “pretty much the entire adult dentition,” says Pickford. He reports that O. tugenensis has thicker tooth enamel than Ardipithecus ramidus. And he notes that O. tugenensis and Homo both have small molars relative to their bigger bodies, a complex not shared by australopithecines, including Lucy, who have big teeth for their small bodies. But A. ramidus has the edge in its hominidlike canines, while O. tugenensis has small V-shaped canines like a female chimp's (see character table on p. 1218).

    Each team ranks the importance of these traits differently and so comes to a different conclusion about ancestry. In fact, only a few parts of each species have been published so far, and thus it's possible that both teams' fossils are the same creature. “I think there's a good chance that Orrorin is really Ardipithecus,” says White's collaborator, C. Owen Lovejoy of Kent State University in Ohio. Lovejoy suggests that the differences in the two apes may merely be variation within the genus of Ardipithecus, which he suggests was the root ape that once ranged throughout Africa—a view Senut and Pickford strongly protest.

    Into the woods

    While each team has been analyzing its fossils and preparing its case, a third team has unearthed an equally ancient, as-yet-unpublished skull that may shed light on the competing claims. The skull was discovered last summer in the Djourab Desert in northern Chad, in a layer of sediment that may be more than 6 million years old. Members of the French-Chadian Paleoanthropological Mission, led by paleontologist Michel Brunet of the University of Poitiers, won't discuss details until they publish, but those who have seen the skull are intrigued by its mix of old and new traits. “You only have to look at Brunet's skull to see that things were more complicated than we thought,” says GWU's Wood.

    The desert where Brunet's team found the skull is perhaps the most hostile environment for plumbing human origins. One field rule is never to touch metal, as it might be a land mine, and the wind is relentless. But that wind moves the dunes and exposes new fossils, says Brunet. Since 1994, his team has found an amazing 8000 fossils, mostly animals, at 300 sites dating from 3 million to 7 million years ago. Their discoveries include a hominid lower jaw dated to 3.5 million years ago that Brunet has assigned to a new species, A. bahrelghazali, though others suspect it may be A. afarensis.

    If the older, unpublished skull proves to be a hominid, it would kill once and for all the old idea that all hominids evolved on the east side of the African Rift Valley, where most fossils have been found, and that the other African apes evolved on the western side. “Chadian hominids show that part of our human story is in West Africa,” says Brunet.

    Digging the desert.

    Michel Brunet searches for fossils in Chad.

    CREDIT: SIGMA-CORBIS PARIS

    And it would cast more doubt on the once-popular idea that bipedalism emerged after climate change forced apes out of the trees into the grasslands. Animal remains associated with Brunet's putative hominid fossils—including monkeys and a species of extinct pig—indicate that they may have been living in trees near the shores of ancient Lake Chad, perhaps on the edge of a vast, barren steppe or desert.

    Those environmental details are significant, says White, because they fit with the ancient environment at Ardipithecus ramidus sites in Ethiopia. Analysis of the carbon chemistry of the soils at those sites shows that A. ramidus was not living in grassy savanna, but probably in a forested upland, says paleoanthropologist Stanley Ambrose of the University of Illinois, Urbana-Champaign. And the older A. ramidus kadabba also roamed a thick forest, 25 kilometers to the east.

    At about the same time, the small chewing teeth of O. tugenensis suggest that it, like Ardipithecus, was eating soft fruit and leaves as it foraged through the trees. Soil chemistry and the mix of animal fossils support a wooded environment for the Lukeino Formation too, according to both Pickford and Hill. The bottom line: Thus far, “all older hominids have been found in forested environments,” notes Ambrose.

    If these ancient forest-dwellers do prove to be bipedal, upright walking may have started in the forest, for any number of reasons, such as to carry food, display strength, attract mates, or use tools, says paleoanthropologist Henry McHenry at the University of California, Davis. And it may be that these different hominids had more than one way to walk upright, an idea that gets support from yet another new discovery, this time of a later hominid. South African workers recently unveiled a spectacular 3.3-million-year-old australopithecine skeleton, still partly encased in rock in the Sterkfontein Caves, with an unusual, slightly divergent big toe. Known as Little Foot, this nearly complete skeleton resembles its northern contemporary, A. afarensis, but it has yet to be classified. Although Little Foot was an upright walker, that big toe could have been used to grasp tree limbs and may have created a gait different from Lucy's, says discoverer Ron Clarke of the University of the Witwatersrand (Science, 5 May 2000, p. 798).

    Such diversity in walking styles means that the signature of bipedalism in the bones may vary among upright apes, says paleoanthropologist Jeff Schwartz of the University of Pittsburgh—and even raises the possibility that bipedalism evolved more than once.

    Lucy's origins

    The new insight into ancient hominids' preferred habitat is helping paleontologists find them. “We're learning that these hominids are not ubiquitous; they were restricted to certain habitats,” says White. “Often we find them with seeds, fossil woods, abundant monkeys, and kudu [forest-dwelling antelopes] but lack of abundant aquatic mammals. Often we find them where carnivores have destroyed a lot of the bone. It's this signature that says slow down, stick on this patch, and you're likely to find a hominid.”

    In fact, one reason it took so long to find fossils older than 4 million years is probably that fossil hunters were scouring the wrong places. Through trial and error, paleoanthropologists have learned that the open shorelines of ancient lakes and open grasslands—where later hominids are found—contain few traces of our earliest ancestors. For example, Meave Leakey spent much of the 1990s painstakingly gathering fossils from what were the swampy shores of an ancient river at Lothagam, in northern Kenya. But she found no hominids in that layer, although a scrap of jaw came from an upper layer. “It clearly wasn't where the hominids were,” says Leakey.

    But Leakey had much better luck in slightly younger rocks. Throughout the 1990s, on the scrubby shores of Lake Turkana, Kenya, she and her colleagues found the best contender for the long-sought ancestor of Lucy herself: A. anamensis, which means “of the lake” in the local Turkana language. The 88 fossils, which include many fragmented teeth, several jaws, part of a humerus, and possibly a shinbone, reveal a bipedal australopithecine with a narrow, apelike lower jaw. The fossils were dated to 3.9 million to 4.2 million years ago and were found in what were once the tree-lined banks of an ancient river.

    A. anamensis appears after a major shift at these sites, from browsing species to those that eat more grasses, according to work Leakey published in the Journal of Human Evolution last November with paleontologist Alan Walker of Pennsylvania State University, University Park. That's a sign that “hominids are beginning to get into more open country,” says Leakey. “They would have been eating fruit, insects, small mammals, and perhaps some bird eggs and were predominantly bipeds with tree-climbing [ability].”

    This progression into the bush continued, as Leakey showed with her 1999 discovery of a skull and a jaw fragment of a new species, Kenyanthropus platyops, nicknamed Flat-Faced Man. By 3.5 million years ago, this species, whose flattened face resembles that of a fossil called H. rudolfensis, was moving between grasslands and wooded habitats on the western side of Lake Turkana. Leakey suggests that both fossils may belong in the new genus Kenyanthropus. If that classification holds up (the K. platyops skull was so damaged that many researchers question its assignment), it adds another character to the hominid cast of 4 million to 3 million years ago. In addition to Lucy's species, A. afarensis, the players now include K. platyops and A. bahrelghazali at 3.5 million years, Little Foot at 3.3 million, and possibly the proposed genus Praeanthropus. Although Little Foot may prove to be A. afarensis, Clarke and his colleagues last September announced six even older and more apelike australopithecine fossils from Sterkfontein in South Africa. Thought to be about 3.3 million to 3.5 million years old, these are still being classified.

    Single line or bushy tree?

    Given all this diversity, it is “quite obvious that australopithecines lived all over Africa,” says Walker. But he thinks that all these new fossils may represent variation within a single evolving lineage, with one species grading into the next in linear succession. Although the number of new species has doubled in the past decade, Walker cautions that they are spread over millions of years. “I think there's no strong evidence that there's anything more than one evolving hominid from 6 million years to 2.5 million years,” he says. White and his collaborators share this linear view and even connect the dots between species, saying that Ardipithecus ramidus gave rise to A. anamensis, then A. afarensis, and so on down to Homo, with some diversity at about the time Homo emerges.

    An ancestor's smile.

    The teeth, jaw, and other bones of A. anamensis suggest that it is Lucy's ancestor.

    CREDIT: ALAN WALKER/ © NATIONAL MUSEUMS OF KENYA

    But the field is deeply divided over this issue. When researchers such as Leakey, Wood, Tattersall, Pickford, and Senut look at the new fossils, instead of a parade of hominids, they see a bushy tree with different hominids hanging off different branches at the same time, making it difficult to draw a clear line of descent. “We're seeing a radiation,” says Wood. “If you look at other mammals, what's so unusual about that?” Indeed, says Tattersall, “the big lesson from each of these new finds is that diversity [in anatomy and species] was present from the start.”

    Defining what is special about the human lineage gets harder as the fossils get older and older. “I just told my students, 'I'm sorry, but I don't know how to distinguish the earliest hominid from the earliest chimp ancestor anymore,'” says Wood. Others say there are a few signs of hominid status—at least for now. “Right now the two key traits are bipedality and canine reduction and shape modification,” says Arizona State's Kimbel. “As we go back further in time, it will be fascinating to see if one of these fades away, leaving the other as the seminal hominid modification.”

    Even the current favorite trait, bipedalism, may not be enough to qualify as a hominid if other ancient apes were bipedal too. In the late Miocene, “there was a whole proliferation of these apes, sometimes running around on two legs, sometimes not. Why do they have to be ancestral to us?” wonders paleoanthropologist Peter Andrews of the Natural History Museum in London.

    For casual visitors to that museum of human evolution, all the early figures may look similar—and very much like other apes. But in one ape-man's smile or stance, researchers hope to find the hint of things to come.

  11. BECOMING HUMAN

    New Fossils Raise Molecular Questions

    1. Ann Gibbons

    Paleontologists aren't the only researchers tracing the ape family tree: For years, molecular biologists have been analyzing our family relations by scanning the DNA of living primates and tallying the number of mutations that have occurred over time in comparable stretches. Almost every study has concluded that humans and our closest relatives, the chimpanzees, last shared a common ancestor about 5 million to 7 million years ago.

    But with paleontologists uncovering two or more hominids already on different evolutionary paths by about 6 million years ago (see main text), some researchers say that the timing is getting too close for comfort. By molecular reckoning, before 7 million years ago there shouldn't even be a clear “hominid” lineage. That raises the question: “Has our molecular clock been correctly calibrated?” asks Phillip Tobias of the University of the Witwatersrand in Johannesburg, South Africa. For now, there's enough fudge in both kinds of data to make a consistent scenario, but some geneticists are reviewing their calculations.

    The first molecular study back in 1967 dated the split between humans and apes to 5 million years ago, and Vince Sarich of the University of California, Berkeley, co-author of the study, still stands by that date. “I still bet that either the morphology or dates or both will be found wanting for these '6-million-year-old hominids,'” he says.

    Since Sarich's work, about 10 studies have done the analysis with different stretches of DNA, and most came up with the same range of dates, says evolutionary biologist Blair Hedges of Pennsylvania State University, University Park. In the most recent study, published in the December issue of the Journal of Heredity, Hedges's team compares 50 nuclear genes among great apes and Old World monkeys and concludes that humans split from chimpanzees 4.5 million to 6.5 million years ago.

    Even if the new fossils hold up as hominids, Hedges says the data fit, allowing half a million years for hominids to diversify into the fossil genera Orrorin and Ardipithecus. “If [paleontologists] have something at 6 million years [ago], no problem,” Hedges says.

    But at least one geneticist consistently comes up with much earlier dates, because he uses a different fossil calibration point. Researchers must calibrate their molecular clocks—that is, calculate how many nucleotide changes occur per million years—by using a date from the fossil record. Most use the split between apes and monkeys, typically between 20 million and 25 million years ago. But geneticist Ulfur Arnason of the University of Lund in Sweden thinks that the ape-monkey split is poorly recorded in fossils and probably occurred twice as long ago, about 50 million years ago. To calibrate his clock, he uses several fossil data points that he considers more reliable, such as when whales split from even-toed ungulates, which he dates to 60 million years ago. As a result, he has dated the human-chimp split to 10.5 million to 13.5 million years ago.
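
    To see why the calibration point matters so much, it helps to write the clock out explicitly (a simplified, single-rate version; the round numbers here are illustrative, not drawn from the studies cited). If two living species differ in a fraction $d_{\mathrm{ref}}$ of the DNA compared, and the fossil record puts their split at $T_{\mathrm{ref}}$ million years ago, then the rate $r$ and any other divergence date $T$ follow as

    \[
    r = \frac{d_{\mathrm{ref}}}{2\,T_{\mathrm{ref}}}, \qquad
    T = \frac{d}{2r} = \frac{d}{d_{\mathrm{ref}}}\,T_{\mathrm{ref}},
    \]

    where the factor of 2 reflects mutations accumulating independently along both lineages. Because $T$ scales linearly with $T_{\mathrm{ref}}$, doubling the ape-monkey calibration from roughly 25 million to 50 million years doubles every downstream date, which is essentially the spread between Hedges's 4.5-million-to-6.5-million-year estimate and Arnason's 10.5-million-to-13.5-million-year one.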

    Those dates are now getting a second look. But Hedges and phylogeneticist Morris Goodman of Wayne State University in Detroit stick by their analysis, saying it requires a primate calibration point because mutation rates may have sped up in primates. At any rate, both geneticists and paleontologists will be watching the molecular clock as the new fossils are evaluated.

  12. BECOMING HUMAN

    What Made Humans Modern?

    1. Michael Balter

    Could our species have been born in a rapid burst of change? Researchers from different disciplines are trying to find out

    CAMBRIDGE, MASSACHUSETTS, AND CAMBRIDGE, U.K.—Three hominid skull casts sit in a row on Daniel Lieberman's desk, their empty eye sockets staring eerily ahead. If they could see, they might catch a glimpse of Harvard University's peaceful green quad, just outside the anthropologist's window. But these skulls bear witness, between them, to some of the most dramatic events in human prehistory, including the mysterious birth of our own species, Homo sapiens.

    The first skull, perhaps 300,000 years old, was found in Zambia. It comes from a species that may have been ancestral to both modern humans and Neandertals. The second is a Neandertal from France dating back 70,000 years. And the last is a 100,000-year-old H. sapiens discovered in Israel.

    Lieberman picks up each skull in turn and pokes a pencil up through the eye socket. “Look at the difference,” he says. “When I do this with the modern human, I touch the underside of the frontal lobe. But with the other two, my pencil ends up under the thick, bony brow ridge.” In modern humans, he explains, the face and eyes are tucked under the braincase, rather than thrust forward prognathously, as in all other now-extinct human species. And the modern human skull is globular like a volleyball, instead of oblong like a football.

    Poking into human origins.

    Daniel Lieberman thinks a few genetic changes might have produced the Homo sapiens skull.

    CREDIT: GRAHAM RAMSAY

    In Lieberman's view, these two traits—rather than the long list of characters anthropologists usually rely on—are the key distinguishing features of modern human skulls. And, he says, this reshaping of the skull, which may have accommodated an expansion of the key frontal or temporal lobes of the brain, was produced by small evolutionary adjustments in a few bones along the base of the skull, possibly due to only a handful of genetic changes. If he's right, the rise of modern humans may have been a relatively abrupt event rather than a gradual evolution.

    “It shows that the speciation event doesn't have to be complicated, with a lot of steps,” says Lieberman. “You may only need one change, not 15 or 20 changes.”

    Lieberman's bold proposal is the latest entry in a newly invigorated debate over the making of modern humans. A flurry of new evidence from three sources—fossils, art and artifacts, and genes—is forcing researchers to rethink just what traits mark the origin of our species and how and when these traits appeared.

    Some of this new evidence challenges the notion that the development of modern humans was a slow, gradual process. Whereas Lieberman zeroes in on a few key anatomic traits, for example, other studies raise the possibility that the birth of H. sapiens hinged on only a handful of genetic changes. Some geneticists are even hunting for a few mutations that might have helped launch hominid brains into cognitive hyperspace.

    But other researchers are skeptical that the rise of our species can be explained by the throwing of just a few genetic switches. Indeed, archaeologists—some of whom have long argued for a rapid explosion of cognitive abilities—are now digesting new evidence implying a more gradual development of sophisticated behavior. “In Africa, where our species emerged, we don't see any sudden leaps,” argues anthropologist Alison Brooks of George Washington University (GWU) in Washington, D.C. She and other scientists see the birth of our species as a gradual process of both physical and behavioral change, nurtured by climatic and environmental factors (see sidebar on p. 1225).

    Which scenario is correct is still an open question. “Are we talking about 10 or 1000 or 10,000 significant genetic changes?” asks Oxford University molecular geneticist Chris Tyler-Smith. “We don't know.” But the new wave of research, as well as heightened cross talk among disciplines, is boosting the chances that these different lines of evidence may eventually converge on a consistent scenario for how our species came to be. “In the past, we've had a pretty simplistic view of what modern humans were,” says paleoanthropologist Leslie Aiello of University College London. “Now, we are entering a very exciting period where we are beginning to be able to piece things together.”

    Modern skulls, modern brains?

    Over the past decade, the quest for modern human origins has crossed at least one critical Rubicon: Many researchers now think they know where, and roughly when, H. sapiens appeared. The place: Africa. The time: between 100,000 and 200,000 years ago. Geneticists, for their part, have analyzed current human genetic diversity across the world and then extrapolated backward, using mutation rates as a “molecular clock.” Their studies conclude that all modern humans are descended from an ancestral population that lived in Africa sometime after 200,000 years ago, with many dates converging on about 130,000 years.

    Many fossil experts come up with a similar story. Most agree that a 130,000-year-old human skeleton from Omo Kibish in Ethiopia and 100,000-year-old fossils from the Klasies River mouth in South Africa are “anatomically modern”: That is, on the evidence of the shape of these bones, they belong to our species.

    Although this is now the majority view, some dissenters remain, including paleoanthropologist Milford Wolpoff of the University of Michigan, Ann Arbor. They continue to argue for the multiregionalism theory: that humans have belonged to the same species for nearly 2 million years and evolved simultaneously across the globe.

    This has been a bitter conflict for 2 decades, and it is far from over. But the growing consensus on African origins means that “we can now devote our energies into delving into the African record and narrowing down where and why these changes happened,” says paleoanthropologist Chris Stringer of the Natural History Museum in London, a leading Out-of-Africa advocate.

    Indeed, for those who subscribe to the Out-of-Africa hypothesis, it might seem a fairly straightforward matter to zero in on what happened on that continent about 130,000 years ago or earlier, particularly with respect to the skull and brain. But life and evolution are not so simple.

    Take overall brain size, a feature long thought to be roughly correlated with thinking power. Modern humans, with an average cranial capacity of 1300 to 1400 cubic centimeters (cc), do have somewhat larger brains than those of most earlier humans; the brains of H. heidelbergensis, for example, a species that some researchers think was ancestral to both humans and Neandertals, weigh in at 1000 to 1300 cc.

    But top honors for brain size, at 1400 cc, go to the Neandertals, considered by many researchers to be a separate species from us. Even when researchers adjust for Neandertals' more robust bodies, their brains were still only slightly smaller than ours.

    This has led some researchers to suggest that Neandertals, who coexisted for thousands of years with H. sapiens until they went extinct about 30,000 years ago, could have been as smart as modern humans. “It might be good to think of Neandertals as the other modern species,” says anthropologist Christoph Zollikofer at the University of Zürich. “The large brains of Neandertals and modern humans might represent parallel evolutionary achievements, and their different cranial shapes would reflect different evolutionary strategies to pack a large brain into a small space.”

    Yet other scientists doubt that Neandertals were as intelligent as modern humans, given their less sophisticated tools and limited symbolic behavior. This debate too is far from resolution (Science, 14 September 2001, p. 1980), but it's clear that overall brain size is not the whole story.

    If modern human brains aren't outstanding in overall size, then what are our distinguishing characteristics? Lieberman is one of a long line of physical anthropologists to focus on the shape of our skulls. In 1982, for example, Stringer and paleoanthropologist Michael Day, now also at London's Natural History Museum, proposed seven skull characteristics—such as the rounded shape of the parietal bones on the roof of the skull and very reduced brow ridges—as diagnostic for H. sapiens. Other researchers have since added other items to the list. “This is the only guide we've had for the past 20 years,” says Aiello.

    Using these criteria, several skulls fall on the borderline between modern and “archaic” forms. This has led many researchers to conclude that the first anatomically modern humans represent a continuum of gradual changes that began hundreds of thousands of years earlier—not a sudden speciation event. “I see a rather gradual evolution from an ancestral early archaic grade of Homo sapiens … to early anatomically modern humans,” says paleoanthropologist Günter Bräuer of the University of Hamburg in Germany.

    When Lieberman approached the problem in a new way, however, he came up with a much more clear-cut difference between archaic and modern skulls. In a study published online last month by the Proceedings of the National Academy of Sciences, Lieberman and his co-workers measured the Stringer and Day characteristics and other features of more than 119 skulls. They included 100 recent H. sapiens from around the world, 10 older anatomically modern humans, five Neandertals, four H. heidelbergensis, and a number of modern human children and chimpanzees at various growth stages.

    The group performed a series of computer-based analyses on these raw data. First they identified those combinations of traits that best account for the variation in shape between modern human and other hominid skulls. Much of the variation could be boiled down to the two features of globularity and facial retraction, or how tightly the face is tucked under the braincase, says Lieberman. And these two characteristics appear sufficient to distinguish modern from archaic skulls, with no overlap.
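
    The article doesn't spell out the statistics behind that analysis, but the operation it describes, finding the combinations of measurements that best account for shape variation, is classically done with a principal component analysis. A minimal sketch follows; the random stand-in data, the choice of seven measurements, and the use of PCA itself are illustrative assumptions, not a description of Lieberman's actual pipeline.

    import numpy as np

    # Hypothetical data: one row per skull, one column per measurement
    # (e.g., cranial-base angle, facial projection, vault height).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(119, 7))
    X = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize each measurement

    # Principal components: orthogonal combinations of traits, ranked by
    # how much of the total shape variation each one explains.
    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)       # eigh returns ascending order
    order = np.argsort(eigvals)[::-1]            # re-rank, largest variance first
    explained = eigvals[order] / eigvals.sum()   # variance fraction per component
    scores = X @ eigvecs[:, order]               # each skull's component coordinates

    # If the first two components dominate 'explained' and modern vs. archaic
    # skulls form non-overlapping clusters in scores[:, :2], then two trait
    # combinations (analogous to globularity and facial retraction) are
    # enough to separate the groups.
    print(explained[:2])
    print(scores[:3, :2])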

    What's more, the growth patterns of humans and chimpanzees showed that these uniquely human features stem from early developmental shifts in the bones making up the cranial base. For example, the anterior (forward) segment of the cranial base is 15% to 20% longer relative to cranial size in modern humans than in either extinct species, and the base is bent at a much sharper angle in moderns. That flexing allows the face to grow tucked under the braincase rather than jutting forward, explains Lieberman (see figure below).

    Angling for brain space?

    A sharper flexion of the cranial base in modern humans (bold line, right) compared to archaic forms (left) tucked the face under the frontal lobes.

    CREDIT: MUSEE DE L'HOMME/NATIONAL MUSEUM OF NATURAL HISTORY

    The fact that these traits arise almost entirely during prenatal and infant development is important, he says: This kind of early alteration in growth pattern, rather than later developmental tinkering, can create major changes in body form.

    That argument was bolstered by a study of Neandertal and modern human infants and children, published last year in Nature by Zollikofer and anthropologist Marcia Ponce de León, also at the University of Zürich. They too found that key differences between Neandertals and modern humans, including the angle of the cranial base, emerge very early, probably before birth. “Their analysis fits my model perfectly,” says Lieberman. Zollikofer agrees: “[We] start from different ends to reach similar conclusions.”

    Haven of modernity?

    Discoveries at South Africa's Blombos Cave may push evidence for sophisticated behavior back to 77,000 years ago.

    CREDIT: NATIONAL SCIENCE FOUNDATION

    As for what triggered this alteration in skull shape, Lieberman proposes that the crucial factor may have been a relative expansion of the frontal or temporal lobes. Physical anthropologists argue about whether that expansion is proportionately greater than the overall brain expansion in humans. But neuroscientists, who have spent decades probing and scanning the frontal lobe, do associate this region with many of the hallmarks of modern human behavior, such as creative thinking, artistic expression, planning, and language; the temporal lobe is linked to hearing and memory.

    Expansion of those areas would lengthen the anterior segment of the cranial base as well as push the face into a more vertical position, Lieberman notes. And, he says, studies of brain deformations in living infants show that “the shape of the brain changes the shape of the braincase and not vice versa”—making it likely that natural selection acted on the brain and that braincase shape followed.

    But, although Lieberman's study may suggest a sharp distinction between archaic and modern skulls, other researchers question some of his assumptions. Stringer cautions that Lieberman's analysis requires using fossil specimens that have relatively complete faces and crania. That leaves out a number of incomplete—and perhaps transitional—skulls that might represent early moderns. And Bräuer comments that some fossils in Lieberman's sample may not be modern human ancestors. “If you took 10 different skulls, perhaps you would see an overlap” between archaic and modern human forms, he says.

    Others take issue with the notion of narrowing the Stringer-Day criteria to just two traits. Other, independent features, such as reduced brow ridges, are important too, says paleoanthropologist Ian Tattersall of the American Museum of Natural History (AMNH) in New York City. Stringer says that “other parts of the skeleton, such as the pelvis, may also be relevant” in distinguishing our species.

    But Tattersall agrees with Lieberman's overall conclusion that the modern human skull may be the result of a small number of evolutionary events. “I would be very surprised if the distinctions between Homo sapiens and its closest relatives were not due to a relatively small genetic change with major developmental consequences,” as Lieberman has suggested, says Tattersall. And the Lieberman and Zollikofer studies “certainly back each other up in saying that there are a few fundamental features which are key to the differential growth patterns” between modern and extinct humans, says Stringer.

    Although they haven't won universal acceptance yet, Lieberman's new criteria for anatomical modernity “advance the debate” and “are probably as good as anything we have right now,” says Aiello of University College London.

    But physical anthropologists have yet to deal with a remaining issue, Aiello notes: Do these innovations in skull anatomy really add up to making modern humans a truly different species, that is, a separate group that could not breed with Neandertals or other extinct humans? That's the definition of species on which many evolutionary biologists insist. “Nobody's cracked this question yet,” she says.

    Revolution or evolution?

    While anatomists ponder the link between modern skulls and modern brains, archaeologists are cataloging what may seem to be more direct clues to ancient minds: the tools, hearths, artwork, and other traces that early humans left behind. “A species is as a species does,” says anthropologist Stanley Ambrose of the University of Illinois, Urbana-Champaign. But the archaeological record has led to a perplexing puzzle: There is relatively little evidence for dramatic changes in behavior until long after the appearance of modern anatomy.

    Many researchers have thought that the first signs of truly modern behavior did not appear until about 50,000 years ago, during the so-called Later Stone Age (LSA) in Africa. This was soon followed by what some have called a “human revolution” starting about 40,000 years ago during the Upper Paleolithic period in Europe. The archaeological record seems to explode with creative activity, including personal ornaments, elaborate ritualistic burials, and fantastic cave paintings, such as the 32,000-year-old artworks at the Grotte Chauvet in France (Science, 12 February 1999, p. 920).

    This burst of culture has led some scientists, notably archaeologist Richard Klein of Stanford University in California, to propose that about 50,000 years ago, the human lineage underwent a genetic change that boosted the brain's cognitive powers. This mutation, Klein argues, unleashed many of the abilities we associate with modern humans, including language, abstract thought, and symbolic expression. “The [anatomy] stayed the same,” Klein says. “But look what happens to the cultural record. Before, [the brain] had just one simple sort of software. Now, you get hardware that can run all kinds of software.”

    But the conclusion that modern behavior came so late—and so suddenly—is now under attack. Perhaps the most dramatic onslaught came last month, when a team led by archaeologist Christopher Henshilwood of the South African Museum in Cape Town reported what it claims is the world's oldest art—two 77,000-year-old pieces of red ochre engraved with geometrical designs, found at Blombos Cave on South Africa's south coast (Science, 11 January, p. 247, and p. 1278 of this issue). “This is a fantastic discovery,” says GWU's Brooks. “It is really proof of symbolic behavior at this early date.”

    Indeed, even before the Blombos research was published, Brooks, together with paleoanthropologist Sally McBrearty of the University of Connecticut, Storrs, had thrown down the gauntlet to those who espoused the cultural explosion idea.

    But is it art?

    Archaeologists debate whether these very ancient ochre engravings (top) are evidence of symbolism as seen in the younger Grotte Chauvet paintings (bottom).

    CREDITS: (TOP TO BOTTOM) NATIONAL SCIENCE FOUNDATION; MINISTERE DE LA CULTURE ET DE LA COMMUNICATION, FRANCE

    In an article 2 years ago in the Journal of Human Evolution (JHE), entitled “The Revolution That Wasn't,” the pair argued that the real roots of modernity could be found long before 50,000 years ago. They cited a number of recent excavations in Africa, including Blombos, that reveal sophisticated stone- and bone-tool manufacture, advanced hunting and fishing skills, and well-developed exchange networks—evidence, they claim, for modern behavior tens of thousands of years earlier.

    Although they agree that such evidence is much more abundant during the Upper Paleolithic, the cognitive transition “wasn't sudden,” says Brooks. “It was improving on a basic plan that was already there.” Most advances in behavior during this later period, they argue, were due to cultural rather than genetic evolution—the genetic changes had already happened.

    Henshilwood adds that the Blombos finding might turn out to be the “tip of the iceberg” once more pre-LSA African sites are excavated. Brooks agrees: “We see numerous intimations that humans … already had these abilities,” she says, including ancient ostrich eggshell beads from Mumba Rock Shelter in Tanzania and barbed bone points from Katanda in the Congo. Indeed, last December in JHE, Henshilwood's team reported finding a cache of elaborately worked bone tools (often interpreted as evidence of modern behavior) in Blombos layers also dated to about 70,000 years ago.

    But such artifacts are considered by many to be less convincing than, say, actual art. And some researchers are unwilling to draw broad conclusions from the red ochre designs. Klein, who has worked at Blombos as an animal-bone analyst, says that “the meaning of these pieces will remain debatable so long as they are unique.”

    New York University archaeologist Randall White argues that neither the isolated red ochre designs nor the bone tools represent “evidence of organized symbolic systems shared across space and through time”—the hallmark, he believes, of the kind of fully modern behavior seen in the Upper Paleolithic. White says that the earlier finds cannot be equated with full-blown symbolic representations such as the paintings of horses, lions, and rhinos in the Grotte Chauvet.

    Yet White, unlike Klein, doesn't think that the Upper Paleolithic “revolution” was driven by genetic changes. In fact, he and many other researchers agree with Brooks that the Upper Paleolithic explosion was the result of “cultural and not biological changes,” as paleoanthropologist Robert Foley of Cambridge University puts it. In other words, even if some sort of genetic speciation event gave rise to modern brains, that does not mean that fully modern behavior flowered immediately thereafter—in which case the archaeological record may be a poor guide to the timing of genetic and neurological changes.

    “The big bang” of genetically determined cognitive advances “came with modern humans,” suggests psychologist Michael Tomasello of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. But what followed, he argues, was a “ratcheting effect”: cultural evolution fueled by improved transmission of knowledge, especially via language. Says AMNH's Tattersall: “We didn't go from the first blade tool or Blombos geometric engraving to moon shots overnight, and we are still learning new ways to deploy our capacities today.”

    Earlier hominids, adds anthropologist C. Owen Lovejoy of Kent State University in Ohio, “may well have been every bit as intelligent as we are today, but they lacked the shoulders of giants on which to perch.”

    The quest for genes

    While anthropologists and archaeologists opine about how many genetic changes may have led to the birth of H. sapiens, geneticists themselves have been trying to gather data, but they face a difficult challenge. Teasing out ancient DNA is a formidable task that has been successfully performed on only a few relatively recent Neandertals. Fossils as old as 100,000 or more years remain out of reach. And, although a few geneticists are eagerly scanning primate and human genomes for differences in genes and gene expression, especially in the brain (Science, 6 April 2001, p. 44), those studies reveal differences between apes and humans—not what separates modern humans from extinct ones.

    Cognitive leap?

    A segment of the hominid X chromosome (left) was copied onto the Y (center) 3 million to 4 million years ago. It later split and partly inverted (right).

    CREDIT: ADAPTED FROM G. KUKLA, PALAEOGEOGR. PALAEOCLIMATOL. PALAEOECOL., 72: 1 (1989)

    “We are looking at less than a 2% genetic difference between chimps and humans, but vast differences in morphology and behavior,” says paleoanthropologist Mark Collard of University College London. “I don't think the genetic data will be a panacea to solving [the origins of modern humans].”

    Nevertheless, there have been a few recent breakthroughs. Last year in Nature, for example, researchers reported finding a gene directly implicated in the ability to speak—an ability many, though not all, researchers believe is unique to modern humans. Researchers are now probing that gene's evolutionary history to see if it underwent mutations around the time of the birth of H. sapiens (Science, 5 October 2001, p. 32).

    And one group of researchers has set out on what some believe is a quixotic quest to explicitly identify the genes that make us modern. They have even identified a candidate gene that, they say, might be responsible for language and other advanced cognitive abilities.

    Back in the early 1990s, Oxford University psychiatrist Timothy Crow hypothesized that just such a gene, key to language and the brain asymmetries that many researchers believe accompany it, might be located on the sex chromosomes. Such a possibility might seem far-fetched, because there are few functional genes on the Y chromosome and most are involved with male fertility. But Crow already had a good idea about where to look.

    A decade earlier, geneticist David Page's team at the Massachusetts Institute of Technology (MIT) had identified a 4-million-base-pair block of DNA on the X chromosome, called Xq21.3, which was missing from the Y chromosome in all mammals except humans. Page and others found that about 3 million or 4 million years ago—that is, after the chimp and human lines split—this segment was copied onto the Y chromosome of the hominid lineage. The homologous Y chromosome segment then underwent a rearrangement called a paracentric inversion, in which it reversed direction and split into two parts (see diagram). Crow speculated that this second genetic event sparked genetic changes on both the X and Y segments that ultimately led to the speciation of modern humans. He predicted that one or more genes important to brain function would be found in these chromosomal segments.
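
    That two-step history, a duplication followed by a split-and-reverse rearrangement, can be mocked up with a toy string model (purely illustrative: the letters, the split point, and which piece inverts are arbitrary stand-ins, since the actual breakpoints aren't given here).

    # Toy model of the Xq21.3 story; each letter stands in for a large
    # block of sequence within the ~4-million-base-pair region.
    x_block = "ABCDEFGH"          # the region on the X chromosome

    # Step 1 (3 million to 4 million years ago): the block is duplicated
    # onto the Y chromosome of the hominid lineage.
    y_block = x_block

    # Step 2: paracentric inversion. The Y copy splits into two parts, and
    # one part reverses direction relative to its X counterpart.
    part1, part2 = y_block[:3], y_block[3:]
    y_after = part1 + part2[::-1]

    print(x_block)   # ABCDEFGH  (the X copy is unchanged)
    print(y_after)   # ABCHGFED  (the Y copy: split, second part inverted)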

    Then, 2 years ago, Cambridge University molecular geneticist Nabeel Affara's group reported finding similar functional genes, called PCDHX and PCDHY, in this segment on both the X and Y chromosomes. The genes code for members of the protocadherin family of proteins, biomolecules that play critical roles in the development of the nervous system. Sure enough, PCDHX and PCDHY “are expressed almost entirely in the brain,” says Affara.

    But because no one has yet been able to date the paracentric inversion, Crow's theory linking it to modern human origins remains speculation. Affara and Crow are working together to better characterize the PCDH genes and what they do, in the hopes of demonstrating that they are crucial to cognitive abilities associated only with modern humans. In the meantime, comments Tattersall, “I take my hat off to them for trying to come up with a mechanism” for the speciation of H. sapiens. “They may be wrong, but we need as many ideas as possible.”

    It may be some time before all of these new ideas in anthropology, archaeology, genetics, and other disciplines come together to create a coherent picture of modern human origins. But researchers are encouraged by the interdisciplinary attempts. “This new research will provide the springboard for a lot of other discoveries,” says Aiello. “We are on our way.”

    There may be few sure answers so far, but one thing seems certain: Sometime during the last 200,000 years or so, evolution blessed us with the wisdom to ask the questions.

  13. BECOMING HUMAN

    Why Get Smart?

    1. Michael Balter

    To us humans, it may seem that smarter is always better. But only once in the history of life on Earth did natural selection favor the evolution of brains sophisticated enough to send people to the moon, paint the Mona Lisa, or wonder about their own origins. However that evolution unfolded (see main text), most anthropologists think that advanced human cognition was no evolutionary accident but an adaptation to a challenging environment.

    Experts have suggested that a series of wild global climate swings, which were especially intense beginning about 250,000 years ago (see graph), forced hominid species to adapt or go extinct. At times the African climate switched from very cold and dry to very hot and humid within a single century. “Imagine these human populations being shoved around by these tremendous changes,” says anthropologist Chris Stringer of the Natural History Museum in London. “This would have had profound effects” on their survival.

    Wild ride.

    Humans had to survive sudden, dramatic climate changes.

    ILLUSTRATION BY C. CAIN

    But if conditions varied so often, what exactly were humans adapting to? To variation itself, says paleoanthropologist Richard Potts of the Smithsonian Institution in Washington, D.C. He argues that human evolution is marked by an increasing ability to deal with change—the result of a process he calls “variability selection”—rather than adaptation to specific habitats. And this ability, he believes, reached its height with modern humans. The hallmarks of Homo sapiens, he says, are “the use of complex symbolic codes and abstraction, [which] presented the potential for behavioral diversification and extraordinarily sophisticated alteration of the surroundings.”

    Many experts believe that this plasticity in responding to new environmental challenges laid the cognitive foundations for our ability to creatively solve new problems—like getting to the moon—that our ancestors never had to face. “The minds of our ancestors were not hardwired with specific strategies for felling mastodons but with more general categories such as 'person,' 'living thing,' 'action,' 'cause and effect,'” says cognitive neuroscientist Steven Pinker of the Massachusetts Institute of Technology. When these categories were recombined in the mind, Pinker adds, “an unlimited number of new ideas … or courses of action could be formulated.”
