News this Week

Science  28 Jul 2000:
Vol. 289, Issue 5479, pp. 518
  1. EMERGING DISEASES

    Malaysian Researchers Trace Nipah Virus Outbreak to Bats

    1. Martin Enserink*

    ATLANTA—Scientists are a step closer to unraveling a medical mystery that killed 105 people in Malaysia last year and destroyed the country's pig industry. The Nipah virus, which caused the disease, most likely originated in a native fruit bat species, Malaysian researchers reported here at a meeting* last week. They say the findings will help Malaysian health authorities prevent future outbreaks of the Nipah virus. Others see the case as an argument for expanding research into infections that can leap the boundary between animals and humans.

    The Nipah virus study highlights the enormous potential of the animal world to produce new human pathogens—the threat of “emerging diseases” that increasingly concerns researchers and public health experts. Three out of every four recent emerging diseases arose from animal infections, or zoonoses, according to another study presented at the meeting. And most researchers agree that rather than scramble to identify the source of such outbreaks after they happen, it makes sense to try to survey and understand the plethora of unknown viruses in wildlife before a crisis strikes. “The emphasis in emerging diseases should be in the field,” says Durland Fish, a medical entomologist at Yale University. “As it is now, we wait until we have an epidemic of some new zoonotic pathogen and then try to figure out what it is and why it occurred.”

    The Nipah outbreak, virologists say, followed a classical pattern of havoc and panic. In late 1998 and early 1999, hundreds of pig industry workers in Malaysia—as well as 11 employees of a Singaporean slaughterhouse—came down with a severe form of encephalitis that killed about 40% of the patients. Malaysian health authorities initially mistook the outbreak for Japanese encephalitis, a similar brain infection caused by a fairly common virus. But a team from the University of Malaya in Kuala Lumpur and the Centers for Disease Control and Prevention in Atlanta eventually identified the previously unknown Nipah virus as the culprit (Science, 16 April 1999, p. 407).

    Nipah, named after the town in Malaysia where the first known victim lived, closely resembles a virus called Hendra, which had killed over a dozen horses and two people in Australia in 1994 and 1995. Both belong to a virus family called the Paramyxoviridae. Australian researchers had already shown the Hendra virus to be widespread among three species of Pteropus bats, which feed on fruit and nectar. These furry creatures, commonly known as flying foxes, are not made ill by the Hendra virus. Nor do people who handle the animals get infected. But researchers suspect that when horses ingest bat urine or placental fluid, they act as “amplifying hosts,” vastly increasing the number of infectious viral particles, which puts humans at risk.

    Suspecting that something similar might be going on with Nipah, researchers last year turned their attention to fruit bats living in a cave close to affected pig farms. They found that two native species—the island flying fox and the Malayan flying fox—carried antibodies to the Nipah virus. But these may have been just infected bystanders. To prove that the bats were a reservoir, researchers had to show that the animals carried live virus.

    This year, a team led by the University of Malaya's Lam Sai Kit studied a large bat colony on Tioman Island, off the eastern coast of peninsular Malaysia. They collected urine by laying large sheets of plastic on the ground near the bats' roost. After taking over 1000 samples, they detected the virus in the urine of one island flying fox, Lam told the meeting. In addition, they found live virus in a piece of fruit that had been munched on by a bat, suggesting that the virus was also present in the animal's saliva.

    The results prove that the island flying fox is a Nipah reservoir, says Lam. Pigs probably became infected when they ingested bat urine or saliva, perhaps by scavenging half-eaten fruit dropped from a tree, and then served as amplifying hosts. That's not an unlikely scenario, says Lam, because many Malaysian pig farms have fruit trees. The finding may be key to preventing future outbreaks, says Lam: “We can now encourage pig farmers not to grow any fruit trees in and around the farm.” (Because there aren't any pig farms on Tioman Island, the bat colony there probably doesn't pose a risk, says Lam.)

    The Malaysian work has added a small piece to a large puzzle involving flying foxes, which occur in a broad belt from Australia to India. Researchers suspect that the animals harbor many Paramyxoviridae. In 1997, Peter Kirkland of the Elizabeth Macarthur Agricultural Institute in Camden, Australia, and his colleagues discovered that several flying foxes carried a virus called Menangle, which also infects pigs, causing fetal deformities and stillbirths. During their search for the Nipah reservoir, Lam and his colleagues discovered a virus closely resembling Menangle, which they called Tioman, as well as a third one, which they haven't characterized yet.

    Indeed, every flying fox species may carry its own members of the Paramyxoviridae family, says John Mackenzie, a virologist at the University of Queensland in Australia. At this point, no one knows how many cause disease in humans and animals. But scientists have embarked on a broad study of many bat species. Mackenzie, for instance, has already found antibodies to the Hendra virus in bats from New Guinea, and he's now trying to obtain bat sera from Indonesia, Laos, and India, while others are planning a hunt in Cambodia. “If there are x viruses in bats, let's find them all,” says Mackenzie. “If you know what's out there, it's much easier to diagnose and understand what's happening next time there is an outbreak.”

    The rise of such animal-borne diseases is not unusual, according to a study presented at the meeting by Mark Woolhouse of the University of Edinburgh in the United Kingdom. Woolhouse and his colleagues spent several months “trawling through the textbooks” to test the common notion that most emerging diseases are zoonoses. They found that humanity is currently plagued by 1709 known pathogens (from viruses and bacteria to fungi, protozoa, and worms). The team concluded that 832 of those, or 49%, are zoonotic. But among the 156 diseases considered “emerging,” 114 are zoonoses—a stunning 73%.

    The bottom line? Zoonoses are three times more likely to be emerging than nonzoonotic diseases, and researchers should keep an especially wary eye on animal pathogens, says Woolhouse. Many more dangers may be lurking in the animal kingdom.
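    The “three times” figure follows from the counts above: roughly 14% of zoonotic pathogens are classed as emerging, versus about 5% of the rest. The short Python sketch below is only a minimal arithmetic check of the numbers quoted from Woolhouse's study (the variable names are ours, not the study's):

        # Counts reported by Woolhouse and colleagues (as quoted above):
        # 1709 known human pathogens, 832 of them zoonotic;
        # 156 considered "emerging," 114 of those zoonotic.
        total_pathogens, zoonotic = 1709, 832
        emerging, emerging_zoonotic = 156, 114

        share_zoonotic = zoonotic / total_pathogens              # ~0.49 -> "49%"
        share_emerging_zoonotic = emerging_zoonotic / emerging   # ~0.73 -> "73%"

        # Fraction of each group that is classed as emerging
        rate_zoonotic = emerging_zoonotic / zoonotic                                      # ~0.14
        rate_nonzoonotic = (emerging - emerging_zoonotic) / (total_pathogens - zoonotic)  # ~0.05

        relative_risk = rate_zoonotic / rate_nonzoonotic  # ~2.9, i.e. roughly three times
        print(f"{share_zoonotic:.0%} of pathogens are zoonotic; "
              f"{share_emerging_zoonotic:.0%} of emerging diseases are; "
              f"zoonoses are ~{relative_risk:.1f}x more likely to be emerging")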

    • *International Conference on Emerging Infectious Diseases, 16 to 19 July, Atlanta.

  2. ELECTRONIC OPTICS

    Organic Lasers Promise New Lease on Light

    1. Robert F. Service

    Tiny solid state lasers are big business: Almost $500 million worth of the devices are sold each year for uses ranging from compact disc players to telecommunications equipment. But they have big drawbacks: Many of today's lasers are made from inorganic semiconductor chips—similar to those at the heart of computer processors—that require expensive clean-room facilities to manufacture, and their color palette is somewhat limited. Researchers have long pinned their hopes on organic materials, which are typically easier and cheaper to process. But to date they have managed to coax organic solids to lase only when blasted with a beam from another laser—hardly a commercial advantage. Now, however, organics have finally begun to shine on their own.

    On page 599, a team at Lucent Technologies' Bell Laboratories in Murray Hill, New Jersey, reports that they've devised the first electrically powered solid state organic laser, a step that could open the floodgates for novel lasers that are cheaper and that shine in colors inorganics can't match. The feat is “big news,” says Yang Yang, an optics researcher at the University of California, Los Angeles. “This is really a milestone. People have tried to make organic electrically pumped lasers for a long time.”

    Figure: Beaming. Transistor “gate” electrodes on top and bottom cause electrical charges to flow between two additional pairs of electrodes (top). A voltage applied between the transistors then causes these charges to enter the middle layer, where they produce photons (center) that generate a laser beam (bottom). (Credit: A. Stonebraker)

    To work as lasers, solid materials must function something like an interstate freeway: They must allow lots of traffic—photons in this case—to speed along without hitting potholes, and they must have plenty of on-ramps for electrical charges to get into the device. In conventional inorganic lasers, the on-ramps are metal electrodes placed above and below a semiconductor crystal. When a voltage is applied between the electrodes, electrons flow into one side of the crystal and out the other. The electron vacancies left behind, called “holes,” act like positive charges that can move through the material as an electron on a nearby atom jumps into the hole, leaving a vacancy where it originated. When electrons coming from one side of the device meet holes coming from the other, they annihilate one another, creating photons in the process. The photons bounce back and forth between mirrors on opposite sides of the crystal, prompting the crystal to release additional photons of the same wavelength. This creates a surge of light, some of which escapes in a beam through a predesigned leak in one of the mirrors.

    Inorganic semiconductors such as gallium arsenide make great lasers because, like a good freeway, they are clean and fast and have easy-access on-ramps. Solid organics, on the other hand, have been more like old country roads: Defects in the materials act like hazardous potholes, trapping photons and causing them to dissipate their energy as heat. And even high-quality organic materials have had big trouble with their on-ramps: Conventional metal electrodes are just too slow at injecting electrons and holes into organics.

    The Bell Labs group—physicists J. Hendrik Schön, Ananth Dodabalapur, and Bertram Batlogg, along with materials scientist Christian Kloc—tackled those two problems in turn. Initially, Kloc used a specialized gas furnace to grow high-purity crystals of tetracene, a molecule that consists of four linked rings of carbon atoms. That gave the researchers the multilane, high-speed freeway they needed for the photons. For their on-ramps, the team did away with the standard electrodes and turned to a pair of transistors, known as field-effect transistors, or FETs. FETs work by applying a voltage to one electrode, called a “gate,” that triggers a flood of electrical charges to flow through a channel between two additional electrodes. Depending on the makeup of the FET, the flowing charges can be either electrons or holes.

    The researchers placed the FETs above and below the tetracene crystal. The bottom FET was designed to flood its channel with electrons, while the top FET sent holes. The team then applied a voltage between the two FETs, which drew the flood of positive and negative charges into the tetracene, where they produced a burst of photons that triggered the lasing process. The scheme worked to perfection, generating a yellowish-green laser pulse. This novel use of FETs “is an important concept, because it allows them to control the charge injection, which is the key to getting this to work as a laser,” Yang says.

    Despite the organic laser's success, it may be a while before organics take over that $500 million market. Growing high-purity organic crystals requires manufacturing processes nearly as exacting as those used to make conventional semiconductor chips. And researchers must also learn how to mass-produce lasers with transistors positioned above and below. Still, Batlogg notes that researchers should easily be able to swap the tetracene for other organics to produce a whole range of different colors of laser light. That should give laser makers something to beam about.

  3. NEUROSCIENCE

    Early Insult Rewires Pain Circuits

    1. Laura Helmuth

    For many years, physicians rarely anesthetized infants or gave them pain-killing medication. They worried that such treatments could interfere with breathing—and they downplayed babies' ability to perceive pain. That's changed in the past 15 years, partly thanks to studies showing that infants respond physiologically and hormonally to pain. A new animal study should amplify the call to manage pain more aggressively in newborn humans: Pain experienced by the youngest infants, the study suggests, could have the longest lasting effects.

    On page 628, neuroscientist M. A. Ruda of the National Institute of Dental and Craniofacial Research (NIDCR) at the National Institutes of Health (NIH) and her colleagues report that painful stimuli delivered to rats shortly after birth permanently rewire the spinal cord circuits that respond to pain. Not only do the circuits contain more axons, but the axons extend to more areas of the spinal cord than they normally would.

    Researchers knew that pain circuits are somewhat malleable in adult animals, but the Ruda team's study shows that “injury to the neonate or fetus can produce changes that are in some way different than [those] in adults,” says neuroscientist Clifford Woolf of Massachusetts General Hospital and Harvard Medical School in Boston. What's more, the NIDCR workers have preliminary evidence that these wiring changes make the animals more sensitive to pain later in life.

    Pain pathways start with sensory neurons in the skin, link to the dorsal horn of the spinal cord, and from there climb to the thalamus and cortex in the brain. To see how painful stimuli affect the spinal portion of these pathways, Ruda and her colleagues injected one hind paw of newborn rat pups with an inflammatory agent that causes the paw to swell and turn red for several days—“kind of like gout” in humans, Ruda explains.

    Some 8 to 12 weeks later, the researchers sacrificed the adult rats and stained their spinal cords with a dye that seeks out pain-sensitive axons. They found about 25% more stained axons in the side of the spinal cord corresponding to the paw that had been inflamed weeks earlier. In addition, the sciatic nerve, which delivers input from the hind limb, projected to six segments of the spinal cord on the treated side, compared to just four on the other side. “You can really see spreading and invasion of these fibers into new areas of the cord,” says molecular neurobiologist David Julius of the University of California, San Francisco.

    Pain changed neuroanatomy only when induced during a distinct developmental window. If the pups were given the noxious injection just after birth or on day 1 or day 3, more neurons became devoted to processing pain. If the researchers waited until day 14, however, they found no neuroanatomical changes. In terms of neurological milestones, day 0 in a rat pup corresponds to about 24 weeks of gestation in a human infant, says Ruda. This suggests that at a very early age, particularly in premature infants, “what's happening could impact the ultimate wiring of the brain.”

    Ruda doesn't know precisely how the stimuli strengthen pain circuits. The extra neural activity could save neurons that would otherwise die, or it could encourage new connections between neurons. But in either case, rats that endured traumatic early days are somewhat more sensitive to pain as adults. Some animals that had experienced the painful injection as pups were injected again as adults. Compared to rats injected for the first time as adults, the previously treated ones pulled their paws away from a hot floor faster, and their spinal cord neurons fired more rapidly in response to a tail pinch.

    As for how this study relates to human newborns, Charles Berde, a professor of pediatrics and anesthesiology at Harvard Medical School, says the rat pup is a “useful model,” even though a few days in a rat's life correspond to months of development in a newborn human. Animal research that shows early pain causes hardwired changes might help convince skeptics of the importance of managing pain in newborns, says Berde.

    Kanwaljeet Anand of the Arkansas Children's Hospital in Little Rock, a pediatrician and neuroscientist whose surveys show that babies in intensive care aren't always getting the analgesics they need, agrees. “What we need now,” says Anand, “is an NIH[-sponsored] consensus panel to provide evidence-based guidelines and help standardize the way premature and full-term babies receive analgesic management.” This sort of research, he says, should help dictate those standards.

  4. OCEAN SCIENCE

    Academy Panel Backs Sea-Floor Observatories

    1. David Malakoff

    The National Academy of Sciences wants marine scientists to go deep. An expert panel last week strongly endorsed a network of remotely operated sea-floor observatories to monitor everything from water chemistry to bacteria. The backing may help ocean researchers win a boost in next year's budget proposal now being crafted by the outgoing Clinton Administration.

    The new report* is in response to a request last year by the National Science Foundation (NSF), which asked whether submerged stations packed with sensors—an approach promoted by some marine scientists—are technically feasible and scientifically desirable. The eight-member panel, led by seabed mapper William Ryan of Columbia University's Lamont-Doherty Earth Observatory in Palisades, New York, answered “yes” on both counts. “Sea-floor observatories present a promising, and in some cases essential, new approach for advancing basic research in the oceans,” it concluded.

    The panel urged NSF to get on with plans for long-term monitoring stations hung beneath a buoy or hitched to abandoned sea-floor cables that can provide power and communications. Such stations, which might be assisted by automated submarines that scan surrounding seas, would augment traditional short-duration, ship-based expeditions and help researchers get a clearer picture of long-term changes in marine environments, says panel vice chair Robert Detrick, a geophysicist at the Woods Hole Oceanographic Institution in Massachusetts. He suggests that NSF could phase in the new program by field testing new sensors and power supplies at a few sites, such as the University of Washington's planned NEPTUNE system that would wire sensors to 3000 kilometers of fiber-optic cable in the northeastern Pacific.

    Expanding such test-beds into a “comprehensive” network of deep and shallow water bases would be expensive, however. The panel estimated that it would cost several hundred million dollars to build and tens of millions a year to operate. NSF's budget couldn't accommodate that “extremely ambitious vision” immediately, says NSF's Mike Purdy, head of ocean sciences. But advocates are hoping that NSF will propose spending up to $30 million in start-up funds as part of its 2002 budget request, to be submitted this fall.

    Some recent high-profile interest in marine research may help their cause. On 28 June President Bill Clinton suggested that Congress consider “shifting another few hundred million dollars to explore the deepest depths.” The speech came a few weeks after Marcia McNutt of the Monterey Bay Aquarium Research Institute in Moss Landing, California, bent his ear on the topic during a White House “millennium” event. And last week, the leaders of a new Congressional Oceans Caucus, including Representatives Tom Allen (D-ME) and James Greenwood (R-PA), told a sea science conference that they will work to strengthen the field next year.

    Another endorsement could come this fall, when the Commerce Department's National Oceans Service is due to deliver a White House-ordered wish list for federal oceans research. Some scientists are pulling for sea-floor research to float to the top of the list.

    • *Illuminating the Hidden Planet: The Future of Seafloor Observatory Science, National Research Council.

  5. DEVELOPMENTAL BIOLOGY

    Embryonic Lens Prompts Eye Development

    1. Elizabeth Pennisi

    A blind cave fish is providing new insight into how eyes come to be. In work reported on page 631, developmental biologists Yoshiyuki Yamamoto and William Jeffery of the University of Maryland, College Park, show that the lens plays a leading role in eye development in this fish. If it doesn't form properly, the researchers found, the embryo will not go on to make the cornea and other eye structures.

    During the 1960s, work in Russia and Spain had suggested such a role for the lens, but this new study “nails it,” says Peter Mathers, a developmental biologist at West Virginia University School of Medicine in Morgantown. What's more, he adds, because the eye develops similarly in all vertebrates, including humans, “the implications are much broader than [for] just the cave fish.”

    The Maryland team has been studying the fish, which is called Astyanax mexicanus, for the past 6 years. Several dozen isolated populations of the species exist in northeastern Mexico, with some living in surface ponds and streams and others in caves and underground waterways. Over the past million years or so, the eyes of the underground fish have degenerated to varying degrees, while the surface fish have retained their large eyes.

    To begin to understand this difference, Jeffery and Yamamoto first monitored eye development in the blind fish. They observed a precursor lens and the rudiments of the optic cup forming during the embryo's first 24 hours. But soon afterward, they found, the cells in the embryonic lens underwent programmed cell death. Other eye structures, such as the cornea and the iris, never appeared, and the retina never developed distinct, organized layers, as it does in normal eyes. The eyeball gradually sank back into the socket and was covered by a flap of skin.

    Because eye development seemed to progress normally until the lens degenerated, Jeffery and Yamamoto wondered whether this disintegration was triggered by a signal from the embryo or from the lens itself. To find out, Yamamoto removed the embryonic lens from one eye of a blind cave fish embryo and replaced it with a lens from a surface fish embryo. He also did the opposite experiment, replacing the lens of an embryonic surface fish with one from a cave fish embryo. In all cases, he labeled the transplanted tissue with dye so he could track what happened to it. “It's not a complicated experiment, but it really [was] very elucidative,” says Mathers.

    In both types of transplants, the lens behaved as if it were still in its original embryo. The one from the cave fish degenerated, even though it was in an environment conducive to further development, whereas the lens from the surface fish thrived in the cave fish embryo and the eye differentiated, forming a cornea, anterior chamber, and iris. These results show that “the lens plays a central role” in determining whether the eye develops, comments David Beebe, a developmental biologist at Washington University School of Medicine in St. Louis. Jeffery doesn't know, however, whether the fish can actually see, as a vision test is quite difficult to devise.

    Other recent work by Jeffery and his colleagues may explain why the lens undergoes programmed cell death in the cave fish. The researchers looked at early embryos for changes in the expression of a variety of proteins that help specify how cells differentiate into specific organs and tissues. As they reported last month in Boulder, Colorado, at the annual meeting of the Society for Developmental Biology, cave fish embryos seem to make more of a protein called Sonic hedgehog in the area destined to be the head. As a result, fewer cells are set aside to form the eyes (Science, 23 June, p. 2119). Jeffery suspects that with fewer cells to start with, the precursor lens may wind up smaller than usual, perhaps too small to survive, and therefore decays. “It's possible you are looking at a single gene defect that has caused a drastic developmental change,” Mathers notes.

    Still unclear, however, is how the embryonic lens of the sighted surface fish triggers further eye development. Presumably the lens produces a molecular signal, which Jeffery and his colleagues hope to identify eventually. They also hope to pinpoint the genes involved in eye development in A. mexicanus. Studying different populations of the fish may provide clues to these genes, notes Beebe: Because populations became isolated when the fish could see and became blind independently, different mutations may be involved in each population.

  6. EUROPEAN SCIENCE

    Urgent Call for Research Overhaul

    1. Robert Koenig

    BERN—The European Union is in danger of losing ground in the global research competition unless its member nations devote more resources to science, restructure the E.U.'s flagship research program, and develop a Europe-wide science strategy, an expert panel says. The recommendations are music to the ears of E.U. Research Commissioner Philippe Busquin, who has been arguing for major changes along those lines.

    In a report issued on 20 July, the 11-member panel calls for “an urgent re-engineering of the overall management and administration” of the E.U.'s Framework Program. Framework 5—which provides $17 billion over 5 years for multinational research efforts and scientific networking—should be made more flexible to respond to hot new research fields, the report suggests, and its complex grant-application procedures should be made “much simpler and easier to understand.”

    SHORING UP THE FRAMEWORK

    Urgently reengineer the Framework Program's management

    Better coordinate E.U. member states' R&D policies

    Increase R&D spending throughout E.U. to 3% of GDP within 10 years

    Encourage researchers to submit proposals for “riskier” projects

    Taking a broader perspective, the panel—scientists, academics, and business leaders from 11 E.U. states—contends that the Framework Program (which accounts for only 5% of Europe's total spending on research) by itself cannot chart a course for European research. It recommends that member nations find better ways to coordinate national research efforts. The panel, appointed by the European Commission, also calls on E.U. member nations, which now spend an average of about 2% of their gross domestic product on R&D, to step up public and private research investment to “at least 3%” over the next decade.

    The E.U. and its member states currently pay research short shrift, says panel chair Joan Majó, a former Spanish industry minister. “Science is becoming so important for Europe now that it can't be left only to the national research ministers,” Majó, an engineer by training, told Science. Although Framework plays a key role in promoting collaboration across Europe, he says, “we need to improve that program and also take wider initiatives to coordinate European research.”

    Some of the panel's recommendations dovetail with Busquin's effort to develop a “European Research Area” (ERA) to help overcome what he called the “fragmentation, isolation, and compartmentalization of national research efforts” in the E.U.'s 15 member states (Science, 21 January, p. 405). In a statement, Busquin said he agrees with the Majó report's overall thrust. E.U. research programs alone, he says, “will not be enough to meet the challenges faced by European research.”

    The Majó report—which examined Framework programs from 1995 to 1999—tapped a vein of frustration among bench scientists. “We found many researchers who are concerned about the excessive bureaucracy and about the means of evaluation,” says Majó panel member Jeanne E. Bell, a neuropathologist at the University of Edinburgh in the U.K. Nearly two-thirds of the 2275 scientists and others who responded to a questionnaire about the Framework programs said they thought “the whole application process was too slow and/or costly.”

    Similar frustrations underlie a second report issued last month by a separate expert group that focused more on the role of the E.U.'s Joint Research Center. The report, by an eight-member group chaired by Viscount Etienne Davignon, criticized the way the E.U. decides which areas to fund under Framework. “In the past, the task has been under-resourced, and too frequently influenced by budgetary and political—rather than scientific—considerations,” said the panel.

    Revamping big programs is not easy in the E.U., but with support from Busquin, the new reports may have an impact on the development of the next Framework Program, which begins in early 2003. Busquin says he wants a “thorough rethink” of plans for Framework 6, with the ERA one of the templates for planning. Other changes are already in the works. Research ministers of the member nations, meeting last month in Lisbon, gave Busquin the green light to pursue several ERA initiatives, including efforts to better network European research centers, increase the mobility of researchers, and conduct a “benchmarking” study of European research.

    Majó, who heads the Catalonian Institute of Technology in Barcelona, says a broad perspective is needed. “The absence of research policy is due to the lack of a real strategy for the future of Europe,” he says.

  7. BIOMEDICAL RESEARCH

    Hughes Grants Target Infectious Diseases

    1. Richard Stone

    Marcelo Briones studies Chagas' disease, a chronic and debilitating illness affecting 18 million people in Latin America. But this week, when he felt his knees go weak, it wasn't from contemplating the terrible human suffering wrought by the parasite. Briones, of the Federal University of São Paulo in Brazil, had just learned he would be getting a 5-year grant from the Howard Hughes Medical Institute under a new program that funds 45 scientists in 20 countries. The $15 million initiative, which supports research on a variety of infectious and parasitic diseases, marks the first Hughes program outside the United States that is tailored to a specific research area.

    The program builds upon two highly praised regional initiatives, one in Eastern Europe and the other serving Canada and Latin America, that support individual scientists working in a broad range of fields. The charity saw an opportunity to prop up an underfunded area, says institute president Thomas Cech. “The economic incentive for research [on these diseases] by large pharmaceutical companies is very limited,” he says. “They may never recoup a large research investment by future sales.”

    Experts say they are surprised that the institute, traditionally a bastion of basic research, is venturing into a more applied arena. But “the more the merrier,” says Richard Lane, head of International Programs at The Wellcome Trust charity in London, which itself supports much work in the area.

    The grantees, chosen competitively, say the Hughes award will allow them to do work that might never have been funded by their national programs. Malaria researcher Ross Coppel of Monash University in Victoria, Australia, wants to examine the enzymes that build the thick and waxy cell walls of mycobacteria, the type of bugs that cause tuberculosis and leprosy. Knocking out one or more of these enzymes could make these bugs more vulnerable to antibiotics. “Granting agencies are often loath to support investigators who are making a major switch of this sort,” says Coppel, one of 11 Australians, the most from any one country (see pie chart).

    The 5-year duration also gives researchers the luxury to travel down paths they might otherwise have ignored. “This gives me the security to try some really ambitious approaches without having to worry about a renewal after just 2 years,” says Geoff McFadden of the University of Melbourne in Parkville. Building on work showing that herbicides kill the malaria parasite in culture, McFadden is investigating the novel idea that herbicides might work as human drugs by targeting the chloroplast found not only in the malaria parasite, but in related protozoa that cause diseases such as toxoplasmosis and coccidiosis.

    Besides paying for supplies and equipment, the awards—ranging from $225,000 to $450,000 a year—are also expected to help support hundreds of young scientists and to strengthen the scientific infrastructure in participating countries. Thomas Egwang of the Medical Biotechnology Labs in Kampala, Uganda, who recalls “breaking into a grin and punching the air in delight” upon hearing about his grant, will teach how to apply advanced molecular biology techniques to studies of river blindness, a fly-borne parasitic disease that afflicts as many as 20 million people worldwide. Deidre Carter of the University of Sydney notes that the grant is especially welcome “at a time when morale in Australian universities and the research community is very low.”

    Briones plans to use some of his grant money for travel. “I want to go to high-quality meetings where I can meet people smarter than me,” he says.

  8. GENOMICS

    Wellcome Trust Backs Genome Computation

    1. Elizabeth Pennisi

    What began as a one-man crusade now has the weight of the world's largest medical research charity behind it. Last week the U.K.-based Wellcome Trust announced that it would spend $13 million over 5 years to fund hardware and software designed to analyze newly sequenced human DNA. Its support of a project called Ensembl (http://www.ensembl.org/) reflects a growing appreciation of the importance of computers to interpret the human genome.

    Ensembl was started in early 1999 by Tim Hubbard, a bioinformaticist at the Sanger Centre near Cambridge, U.K. He began developing computer programs to sort through the vast amounts of data generated by sequencing efforts, which simply determine the order of bases—A, G, T, and C—along each chromosome. However, the sequence has little value or meaning until scientists locate the genes these bases encode and figure out their functions. By midyear, Hubbard had teamed up with Sanger's Michele Clamp, Ewan Birney of the European Bioinformatics Institute (EBI), also in Cambridge, and a few other colleagues to set up automated preliminary analysis of the rapidly emerging rough draft of the human genome.

    “The Ensembl budget was cobbled together,” Birney recalls. “We were working off bits and bobs of other budgets.” By making their rudimentary analysis available to everyone, they also hoped to prevent the genome from being patented by private concerns.

    The Wellcome money will put the project on much firmer footing. It allows the 10-person Ensembl staff to triple over 5 years and greatly increases its computing capacity, adding “the equivalent of hundreds, perhaps thousands, of personal computers,” notes EBI's Graham Cameron. This investment “will speed up the annotation of the human genome,” predicts David Haussler, a bioinformaticist at the University of California, Santa Cruz. “It puts them in a better position to tackle the large bioinformatics problems that are looming.”

    The new funds will be split between the Sanger Centre and EBI, which is an outpost of the European Molecular Biology Laboratory in Heidelberg, Germany. The money arrives at a critical time for EBI, one of the world's three archives of genomics data, whose budget has been hit by changes in the European Union's policies for supporting scientific infrastructure (Science, 25 February, p. 1401). It's also “a vote of confidence [in the field] and a commitment by the Wellcome Trust,” notes David Lipman, director of the National Center for Biotechnology Information in Bethesda, Maryland.

    The award reflects a growing interest in bioinformatics by funding agencies. Haussler, for example, is one of 12 computational biologists who have just been appointed as investigators for the Howard Hughes Medical Institute. The U.S. National Human Genome Research Institute plans to create a network of centers of excellence, several of which will focus on computational biology. The award to Ensembl also kicks off the Wellcome Trust's $150 million initiative in functional genomics, which follows the recent completion of the rough draft of the human genome. “We didn't feel we could wait,” says Celia Caulcott, a Wellcome Trust program manager.

  9. EVOLUTION

    Parasites Make Scaredy-Rats Foolhardy

    1. Carl Zimmer*
    1. Carl Zimmer is the author of Parasite Rex, to be published in September.

    Long before The X-Files, Robert Heinlein wrote about parasitic aliens that alter human minds. In his 1955 novel, The Puppet Masters, slug-shaped creatures arrive on Earth and clamp themselves to people's spines, forcing their hosts to help spread their kind across the planet. Although the rabidly anticommunist Heinlein may have been less interested in biology than in finding an allegory for the Red Menace, The Puppet Masters proved scientifically prophetic: Some parasites, it turns out, alter the behavior of their hosts for their own benefit.

    In the 7 August issue of the Proceedings of the Royal Society of London B, researchers at Oxford University offer a striking demonstration of this ability by the protozoan Toxoplasma gondii. Rats, the intermediate hosts of Toxoplasma, appear to lose their fear of cats when the parasite infects them. And cats, not coincidentally, are Toxoplasma's final host. By precisely altering rat brains, the parasite potentially increases its chances of completing its life cycle. “They certainly have demonstrated that the parasite is changing behavior in a rather specific way,” comments Hilary Hurd, a parasitologist at Keele University in the United Kingdom. “It's fascinating that this happens.”

    What makes the story all the more fascinating is that Toxoplasma is extremely common in humans. Perhaps half of all people on Earth carry its cysts in their brains without visible effects. (It is dangerous only when it invades a host with a weak immune system, such as an AIDS patient or a fetus, in whom it can cause brain damage or even death.) Recent research has hinted that even in this latent form, however, Toxoplasma may create subtle changes in personality.

    The relatively innocuous Toxoplasma is an unlikely candidate for a mind bender. Dwelling in a cat's bowels, it produces egglike oocysts that leave its host's body along with the feces. The oocysts can survive in soil for decades, waiting for a rat or some other warm-blooded mammal or bird to pick them up. Once inside an intermediate host, the parasite invades cells and replicates. Toxoplasma elicits a strong immune response, which prompts the parasite to form tough-coated cysts in which it finds refuge until its host happens to be eaten by a cat. The mildness of Toxoplasma's effects on its intermediate host makes good evolutionary sense: It's not in the parasite's interest to be lethal, as cats find dead animals distasteful.

    Yet a parasite's gentleness need not mean that it's passive. Since the 1960s, parasitologists have documented various ways in which parasites may alter their intermediate hosts to improve their chances of infecting a final host. The lancet fluke Dicrocoelium dendriticum, for example, forces its ant host to clamp itself to the tip of grass blades, where a grazing mammal might eat it. Another fluke, Euhaplorchis californiensis, causes infected fish to shimmy and jump, greatly increasing the chance that wading birds will grab them. To see if Toxoplasma might somehow increase its chances of getting into a cat, the Oxford team, led by zoologist Manuel Berdoy and parasitologist Joanne Webster, set up a maze with a nest box in each corner. On each nest they added a few drops of a particular odor: eau de rat's nest, fresh straw bedding, rabbit urine, or cat urine.

    When the researchers set healthy rats loose in the maze at night, the curious animals shied away from the cat odor and were unlikely to return to that part of the enclosure later in the night. The researchers then put Toxoplasma-carrying rats in the enclosure. In previous experiments they have shown that infected rats are for the most part indistinguishable from healthy ones: They can compete for mates just as well, keep their rank in the rat hierarchy, and have no trouble feeding themselves. In the latest experiment the researchers found only one difference: The scent of a cat had no effect on them. They would explore the nest treated with cat urine at least as often as anywhere else in the enclosure. In some cases, the rats even had a fatal attraction to the cat scent.

    The specificity of Toxoplasma's effects argues against some general pathology. Because both infected and noninfected rats preferred rat reek to rabbit, “that reaction to predator odors is not due to an impairment of the [sense of] smell,” says Berdoy. Instead, he speculates, Toxoplasma cysts may release a compound that interferes with a rat's own neurotransmitters, short-circuiting neurological pathways that would keep the rat out of danger.

    Hurd says Berdoy's work does not close the book on Toxoplasma, however. “One of the key elements that they haven't demonstrated is whether it actually works, whether the host really is predated more because of this behavior,” she says. “This is interesting, but it's really only the beginning.”

    If Toxoplasma finds its way into a human instead of a rat—people can pick up the parasite by handling litter boxes, eating undercooked meat, or gardening in oocyst-laden soil—it has no hope of completing its journey, because cats don't eat people. But there's some evidence that it may alter its host's behavior.

    Parasitologist Jaroslav Flegr of Charles University in Prague administered psychological questionnaires to people infected with Toxoplasma and controls. Those infected, he found, show a small, but statistically significant, tendency to be more self-reproaching and insecure. Paradoxically, infected women, on average, tend to be more outgoing and warmhearted than controls, while infected men tend to be more jealous and suspicious. In the current issue of Biological Psychology, Flegr reports that these personality differences appear to become greater as people are infected for longer periods. Others are not yet convinced. Robert Simon, a psychologist at the University of Delaware in Newark, calls Flegr's work “courageous” but hardly conclusive. “I don't know for sure what to make of it; we need more people looking at [these correlations].”

    Even if the changes are real, people who carry the parasite are hardly likely to throw themselves at lions. But if Flegr's findings hold up, they are a very personal reminder of the ways in which parasites try to control their destiny.

  10. PARTICLE PHYSICS

    Elusive Particle Leaves Telltale Trace

    1. Charles Seife

    Nearly massless and incredibly rare, the tau neutrino scorns its surroundings, seldom interacting with more common matter. These properties make it difficult to detect. Now, an international team of physicists has laid claim to the first “direct” detection of the tau neutrino. Scientists had already confirmed indirectly that the particle exists, but “it was a major experimental success,” says Gordon Kane, a physicist at the University of Michigan, Ann Arbor.

    Figure: Making tracks. About one tau neutrino in a trillion reacted with an iron nucleus, creating a tau particle that left its signature in emulsion plates. (Illustration: C. Cain)

    Neutrinos were discovered after scientists failed to balance their subatomic books. In the 1930s, Wolfgang Pauli proposed that a very lightweight, weakly interacting particle was carrying away the energy that was missing from radioactive decays. The existence of the neutrino was confirmed a few decades later. Physicists believe there are three types of neutrinos, each named for the fundamental particle it interacts with: The electron neutrino interacts with electrons, the muon neutrino with muons, and the tau neutrino with taus. (Some theories posit other varieties of neutrinos, such as the so-called “sterile” neutrinos, but nobody knows whether they exist.) When physicists fire beams of electron neutrinos at a target, they produce electrons. Likewise, muon neutrinos shot at a target generate muons. But no one had observed this for tau neutrinos.

    At the Direct Observation of the Nu Tau (DONUT) experiment based at the Fermi National Accelerator Laboratory near Chicago, scientists tried their hand with an 800-giga-electron-volt proton beam. When the beam smashed into a target, it created all manner of subatomic particles, including, presumably, tau neutrinos. The neutrinos then passed through meter-long steel targets. One out of every trillion tau neutrinos interacted with an iron nucleus and created a tau particle, which, in turn, left a telltale track on layers of emulsions that acted like photographic plates. The yield: four taus that the DONUT team is quite confident came from tau neutrinos.

    “It was a hard experiment, an expensive experiment, and a somewhat unfashionable experiment,” says Stanford University physicist Martin Perl. Physicists already knew that tau neutrinos existed, from missing-energy analysis of tau particles, so some scientists saw no need to perform it at all. Perl disagrees. “It was very, very important to find out,” he says. “Not only does it confirm [the tau neutrino's] existence, it shows that it interacts in a more or less normal fashion.” DONUT team member Regina Rameika agrees. “It's just a relief, really,” she says. “It's kind of one of those things you had to do.”

  11. LIVING WITH A STAR

    Controversy Flares Up Over NASA Solar Project

    1. Andrew Lawler

    Solar scientists are scrambling to put their stamp on NASA's new Living With a Star initiative, a billion-dollar program to study the sun that faces obstacles in Congress

    Ancient astronomers thought the sun was the most important object in the heavens. But in recent times, solar astronomy has been left in the shade by dramatic images of celestial wonders ranging from colorful nebulae to channels cut by springlike seeps on Mars. NASA, the primary federal source of funding for studies of the sun and its impact on the solar system, devotes only about 10% of its annual $2 billion space science budget to such research.

    This year, however, was supposed to be solar physicists' moment in the sun. In February, the president requested a $20 million down payment on a 12-year, $1-billion-plus effort, called Living With a Star, to launch a flotilla of satellites to study the sun and the streams of particles it hurls into space. The data are expected to give researchers critical insight into the sun's inner workings as well as a window on space weather, which has a profound effect on Earth's climate as well as terrestrial communications. The program seemed to have everything going for it, including the backing of space scientists, NASA chief Dan Goldin and the White House, and influential senators.

    But instead of ushering in a new dawn for solar science, the initiative has become mired in controversy that includes a bureaucratic tug-of-war, a debate over research goals, and questions about the propriety of a lucrative contract to manage it. The saga shows how, in the trenches of Washington politics, what seem like assets can quickly turn into liabilities, and how researchers must compete with other interests for organizing and running a big science program. NASA officials are convinced that the project will survive, but the rough-and-tumble politics have upset and perplexed the effort's scientific supporters, a community generally naïve in the ways of Washington. “I thought I was buying a ticket to the ballet, but I ended up at a wrestling match,” says Arthur Poland, the lead scientist for sun-Earth programs at NASA's Goddard Space Flight Center in Greenbelt, Maryland.

    Solar Sentinels

    Focus: Solar surface, wind, and seismology

    Cost: $600 million

    Spacecraft: Five

    Launch Date: 2008-09

    What Poland and other researchers have proposed is a network of satellites ringing the sun and Earth that would monitor solar variability, solar wind, and the interactions of the sun with Earth's magnetosphere and ionosphere (see gallery of images). The first mission, a spacecraft with four main instruments to study solar dynamics, would be launched late in 2006. Two years later, NASA would begin launching several satellites to examine how the sun affects Earth's magnetic field and atmosphere, followed by a series of spacecraft that would closely circle the sun and study the solar cycle. “This will provide terrific data and great opportunities for scientists to understand space weather,” says Richard Behnke, a program manager at the National Science Foundation (NSF). The price tag is estimated at $500 million over the next 5 years and between $1 billion and $1.5 billion over its lifetime, according to Gilberto Colon, the Goddard program manager.

    The idea for such a network goes back to the mid-1980s. But other missions with wider popular appeal, like the Hubble Space Telescope or Mars Pathfinder, repeatedly pushed it down the priority list. “We are a field accused of studying wiggles on a graph,” says Dan Baker, a space physicist at the University of Colorado, Boulder. “To convey our work in a visual way was difficult.” Graduate students were drawn to more vibrant fields, leaving in place gaps created by a spate of retirements. In addition, the field's interdisciplinary nature hindered an effective grassroots lobbying campaign. As a result, as other areas of space exploration blossomed, Baker laments, “we were going out of business.”

    The turnaround came after Europe's Solar and Heliospheric Observatory (SOHO), launched in 1995, began returning stunning pictures taken by a Goddard telescope. Other small spacecraft have since supplied other images. Researchers hoped to parlay the popularity of those pictures and interest in the current peak in solar activity into a 2002 budget initiative. But NASA did them one better. A presentation last August by George Withbroe, who manages NASA's sun-Earth programs, was so successful that Goldin decided to jam Living With a Star into this year's request, and the White House agreed.

    Solar Dynamics Laboratory

    Focus: Interior, dynamics of solar atmosphere

    Cost: $300 million

    Spacecraft: One

    Launch Date: 2006

    Home-field advantage

    Maryland politicians, apprised in January of the new initiative, were enthusiastic. With a nod from the White House, Democratic Senators Barbara Mikulski and Paul Sarbanes announced the effort just before President Bill Clinton released his budget request on 7 February. “This means jobs today and jobs tomorrow,” declared Mikulski. Nine days later, Goddard managers published a notice of their intent to award a sole-source contract to manage the project to the Johns Hopkins Applied Physics Laboratory (APL) in Laurel, Maryland. The contract, according to the notice, would run for 12 years and be worth $600 million.

    The announcement upset much of the solar science community. Goddard scientists, caught by surprise, wondered if the arrangement signaled a diminished role for their center. Industry officials complained that they were being blocked from competing for the contract. Republican House members bridled at a major government program moving forward without competition. The notice even rattled the White House, which sought an explanation.

    Radiation Belt Mappers

    Focus: Origin and dynamics of radiation belts

    Cost: $150 million

    Spacecraft: Two to six

    Launch Date: 2008

    “It was terrible,” says Andrew Christensen, chair of NASA's sun-Earth advisory panel and a space physicist at The Aerospace Corp. in El Segundo, California. The decision, he says, “unfortunately has politicized the program.” Judith Karpen, chair of the American Astronomical Society's solar physics division and a Naval Research Laboratory researcher, warned that “the likelihood for success for any mission will be greatly compromised” if APL is given control over the initiative. In a 3 March letter to William Townsend, Goddard deputy director, she also noted Goddard's success in planning and managing previous sun-Earth missions, including SOHO, and criticized “the unprecedented degree of secrecy” surrounding the choice of APL.

    Researchers, industry lobbyists, and congressional staffers see the arrangement as a bid by Goldin to curry favor with an influential legislator—Mikulski is the ranking Democrat on NASA's spending panel—by propping up a key research facility in her state. APL, with 3000 employees, has seen its mainstay military contracts dwindle in recent years. As evidence, an industry source cites a meeting this spring with Mikulski in which the senator told corporate leaders to accept the fact that APL had won. “We were told not to disrupt the program,” adds one industry official. But Mikulski aides dismiss such talk. “There was no deal,” says a spokesperson. “She has nothing to do with assigning contracts.”

    The arguments over the contract quickly caught the attention of House Republicans. After getting wind of industry and research community concerns, Representative James Sensenbrenner (R-WI), who chairs the House Science Committee, asked NASA's Inspector General (IG) in April to look into the matter. In June, at Sensenbrenner's urging, the House spending panel with oversight of NASA's budget denied funding for the program, in part because of its concerns surrounding the contract. Earlier this month, NASA's IG issued a report finding “insufficient justification for NASA's decision to award this contract on a sole-source basis to APL.” Last week, Sensenbrenner wrote a letter to Goldin asking him to “remove the cloud of uncertainty” hovering over the program by holding a competition.

    But, true to the smoke-and-mirrors nature of Washington politics, some congressional sources say the House criticism is not what it seems. Instead, they see the attacks as part of an effort to win concessions from Mikulski and her Senate colleagues on other programs when the two bodies meet this fall to hammer out NASA's 2001 budget.

    For their part, NASA and APL managers say that the criticism is misguided. “The IG's findings are a huge misunderstanding” riddled with “factual errors,” says Tom Krimigis, a magnetosphere physicist and chief of APL's space department. A NASA response to the study labels it “inaccurate.” NASA managers say they haven't ceded control over the project to APL, and that headquarters will decide where individual spacecraft will be built. “A lot of people assume this [means the initiative] is going lock, stock, and barrel to APL,” says one agency official. “That's not the case.”

    Instead, agency officials say the decision to award a management contract to APL, which has a long history of managing space projects, is intended to ensure that the Maryland lab retains its space capabilities over the next decade. Other organizations, such as APL's archrival, the Jet Propulsion Laboratory in Pasadena, California, have received similar contracts, they add.

    Applied backlash

    Another controversy over the program also stems from what seems at first glance like an asset. When NASA officials briefed staffers on Sensenbrenner's panel in February, they emphasized the potential applications that could flow from the program, including the ability to issue timely warnings of pending communications outages due to solar storms. That strategy appears to have backfired, however. Republican staffers “came unglued” over all the talk about benefits, according to one participant. If the effort was about applications rather than basic research, the staffers argued, then the Defense Department and the National Oceanic and Atmospheric Administration should help pay for it.

    Ionospheric Mappers

    Focus: Effects on Earth's atmosphere

    Cost: $150 million

    Spacecraft: Two to six

    Launch Date: 2009

    NASA officials, wary of interagency programs after a long and bitter battle over control of the remote-sensing Landsat satellites (Science, 30 June, p. 2309), were horrified by the suggestion. Researchers were equally dismayed, flashing back to a protracted debate in the mid-1990s over the relative merits of basic and applied research. Living With a Star, they say, is an effort to understand the complex interactions of the sun and Earth and is firmly rooted in basic research. “There is elegant science to be done,” says Karpen. But the applied side should not be ignored, she adds, contrasting it with other fields of astronomy that “have nothing to do with whether your cell phone works.” NASA and outside researchers are trying to repair the damage, but the House spending bill takes the program to task for its emphasis on applications.

    Yet another challenge is the short time available to flesh out the program's details and win the research community's full backing. The accelerated timetable has left many researchers feeling left out of the process. “Many people are miffed,” says one. The current Goddard plan has not been well received by many outside scientists, who worry that the myriad spacecraft and instruments don't add up to a coherent package. “The community is delighted with the idea of Living With a Star, but there is room for reexamination,” says NASA adviser Christensen. “There is a feeling we need to take a more systematic look.”

    NASA's Withbroe acknowledges that tension. “People are not terribly happy out there,” he says. To address that concern and to avoid the kinds of mistakes that have hampered NASA's Mars program, the agency is creating an independent advisory panel to help develop a clearer and more acceptable plan. “We have the building blocks, and now we want to have a set of architects make sure they fit together,” says Withbroe.

    With the program ensnared in controversy, outside researchers face the task of mobilizing a field that has never before been asked to go to bat for a program of this magnitude. “I don't think this community is very effective,” says Louis Lanzerotti, a space physicist with Lucent Technologies in Murray Hill, New Jersey. “And it's a damn pity.” Colorado's Baker expects the controversy to be a learning experience for most researchers. “Only a few people in the community have been [politically] active,” he says. “For the most part, people have been a little too content to let things play out.”

    Observers predict that Mikulski will triumph this fall in winning funding for the program and for APL. If that happens, the next step will be to maximize the project's scientific value without alienating the politicians who foot the bill. “We really don't like this [APL] deal—we think it stinks—but we don't want it to sink Living With a Star,” says one solar physicist. Adds Christensen: “We want to get the program approved, so we don't want to torpedo it by being too negative.”

    Proponents are rooting for the solar community to demonstrate that it can play in the scientific big leagues. “It's a great program, and it's a real shame it started off on the wrong foot,” says NSF's Behnke. “Let's hope it recovers.”

  12. 5TH INTERNATIONAL ANCIENT DNA CONFERENCE

    Divining Diet and Disease From DNA

    1. Erik Stokstad

    MANCHESTER, U.K.—Some 110 scientists from a range of disciplines gathered in the overcast British midlands for the 5th International Ancient DNA Conference, held here from 12 to 14 July. Among the attractions were successful DNA extractions from human dung and from mammoth bones.

    Tales of Paleofeces

    What you eat can reveal a lot about your lifestyle and health. That's why archaeologists try to piece together ancient diets by picking through campsites for bones, seeds, and other leftovers. What you excrete can reveal even more. It's no piece of cake, however, to identify paleofeces or coprolites—the polite words for ancient poop—as human. Even then, the contents may have been chewed and digested beyond recognition. At the meeting, researchers reported that DNA can not only peg paleofeces as human, but also lay bare a wealth of hidden information on diet. “This is really neat,” says longtime paleofeces examiner Karl Reinhard, an archaeologist at the University of Nebraska, Lincoln. “It will expand our ability to identify the total diet and use of natural resources.”

    Dry, cool caves were not only a good place to live in prehistoric days; they also offer decent enough environmental conditions to preserve DNA. Still, the hunt for DNA from cave samples of paleofeces proved frustrating until 2 years ago, when Hendrik Poinar and Svante Pääbo, then at the Zoological Institute of the University of Munich, Germany, pioneered an approach to release DNA trapped inside the 20,000-year-old dung of extinct ground sloths (Science, 17 July 1998, p. 402). The secret was a compound called N-phenacylthiazolium bromide (PTB), which cleaves sugar bonds that entangle DNA and prevent its amplification—a particular problem for sugar-rich paleofeces. Since then, Poinar—now at the Max Planck Institute for Evolutionary Anthropology in Leipzig—and his colleagues have shown that the technique works for paleofeces of an extinct mountain goat and a different ground sloth from Patagonia, Chile. But their failed attempts to extract DNA from Egyptian mummies had discouraged them from tackling human paleofeces.

    Then Kristin Sobolik, an archaeologist at the University of Maine, Orono, suggested they try paleofeces from Hinds Cave, a well-studied prehistoric rock shelter in the Lower Pecos River area of Texas. Thousands of specimens, ranging in age from 8500 to 500 years old, have been collected, but only a few analyzed. Poinar was game, so Sobolik sent the lab five of the fist-sized nuggets ranging in age from about 2100 to 2400 years old.

    The PTB method worked like a charm. Poinar pulled out human mitochondrial DNA and found sequences, called haplogroups, that are known to be Native American. (An independent lab has replicated the findings.) The group next extracted chloroplast DNA, from which they matched sequences to buckthorn, acorns, sunflower, a shrub called ocotillo, and a kind of nightshade, probably wild tobacco. Sobolik examined the samples under a microscope but could see no remnants of these plants. (On the other hand, cacti and rodents found by Sobolik did not show up in the molecular analysis.) Both the DNA and visual methods identified traces of legumes, yuccas, and elm, which may have been used to brew tea.

    The paleofeces also contain visible bones of pack rats and mice, as well as fish scales. Poinar didn't find DNA from these, perhaps because the samples that he tested lacked the tiny bone fragments. However, he did find sequences for sheep and pronghorn antelope, bones of which have not been found in Hinds Cave. That suggests that the large game was killed and eaten elsewhere, Poinar says.

    The paleofeces findings show that prehistoric American diets were “incredibly diverse,” Sobolik says. One paleofeces sample alone contained remains of four kinds of animals and three plants, while another had two animals and four plants. If archaeologists and molecular biologists work together, Poinar says, “we get a more complete picture of what the hunter-gatherers ate.”

    Next the team hopes to track diet over time. “There was a continual flow of dung over 8000 years,” Poinar notes. Previous studies indicate that diet changed little over those centuries, a hypothesis that DNA may be able to test. For instance, the researchers might be able to detect changes in big game abundance. And two other caves nearby also contain ancient dung. The hunter-gatherers are thought to have switched caves during different times of the year to exploit different resources, such as fall acorns in the upland forests. If so, the paleofeces might reveal seasonal changes in diet.

    What many archaeologists would especially love to know, however, is the sex of a defecator. This could provide insights into dietary differences between males and females, Sobolik notes. She and others have attempted to determine sex in some paleofeces from the ratio of testosterone to estradiol. Analysis of nuclear DNA, of course, would give a much less ambiguous answer. Nuclear DNA is harder than mitochondrial DNA to amplify, because it is so much scarcer, but Poinar says he is working on it.

    Hunting a Mammoth Killer

    Last year, molecular biologist Alex Greenwood and his colleagues at the American Museum of Natural History (AMNH) in New York City prised the first sequences of nuclear DNA from an extinct species, the woolly mammoth. Now they are hoping to use the mammoth as a vehicle for pioneering a new field: paleovirology.

    As a first foray into the relatively uncharted area of ancient viral genetics, Greenwood announced at the meeting that his group had managed to pull out of a mammoth cell's nucleus partial sequences of a class of endogenous retroviruses (ERV). An ERV is born when a retrovirus sneaks into an egg or sperm and sets up camp in its DNA. This has happened many times in all creatures, from yeast to people. Because ERV sequences—some of which are thought to have maintained their activity or acquired functions—appear to be strongly conserved in living mammals, experts caution that ancient sequences probably don't represent a potential gold mine of new information on viral evolution.

    That's fine, because Greenwood and his AMNH colleagues have bigger game in mind. They want to probe a novel idea about what might have driven mammoths, giant ground sloths, and other large mammals to extinction in North America at the end of the last ice age. The two leading ideas are that rapid climate change sharply curtailed food supplies and shrank populations below sustainable levels, or that overhunting by early Americans did the job. The new idea comes from AMNH mammalogist Ross MacPhee and virologist Preston Marx of the Aaron Diamond AIDS Research Center in New York City, who in 1997 proposed that pathogens brought across the Bering land bridge by humans or commensals such as dogs jumped into mammoths with the infectivity of the flu and the lethality of Ebola. Although the idea is provocative, the researchers know of no modern pathogen that would fit this description—thus complicating any hunt for such a shadowy rogue.

    That doesn't mean it's not worth searching for evidence of a so-called “hyperdisease.” Woolly mammoth flesh—sometimes even whole carcasses—is found deep-frozen in permafrost in Alaska and Siberia. MacPhee's team has collected samples from Alaska, mainland Siberia, and Wrangel Island in the East Siberian Sea. Tissue from individuals ranging in age from 26,000 years to 4500 years was well enough preserved to yield nuclear DNA sequences, which were published last November in Molecular Biology and Evolution. This was the first proof that it's possible to extract single-copy nuclear DNA from extinct animals. This kind of DNA is much rarer than the plentiful DNA of mitochondria, the miniature powerhouses inside animal cells that have their own genome.

    Finding a pathogen lurking in frozen mammoth tissue will be much trickier. A pathogen's DNA may be even scarcer than native mammoth DNA, and the pathogen may not have infected bone, the most common fossilized tissue. “It is mostly a matter of luck,” says Greenwood. “You have to find an infected individual who had a high enough viral load that is detectable.”

    Greenwood and MacPhee plan to search mammoth remains before and after humans arrived in North America. If they find viruses or bacteria only in the younger samples, it would support the notion that humans or their domesticated beasts brought diseases that doomed the mammoths. “If they find an exogenous virus,” says retrovirus expert Robin Weiss of University College London, “then I'll sit straight upright.”

  13. DIABETES RESEARCH

    Islet Transplants Not Yet Ready for Prime Time

    1. Todd Zwillich*
    1. Todd Zwillich is a free-lance writer based in Washington, D.C.

    Although recent highly publicized transplantation results are promising, lack of islet tissue will limit the procedure's availability for years

    When researchers in Edmonton, Canada, announced last month that a new procedure for transplanting pancreatic islet cells had freed seven adults with type I diabetes from taking insulin, the news got front-page treatment around the world. Because of the potential therapeutic implications of the work, The New England Journal of Medicine lifted the embargo on a paper detailing the results of the procedure, known as the “Edmonton Protocol,” releasing it on 6 June—more than 7 weeks before its scheduled publication date of 27 July.

    News stories touted the findings as the beginning of the end of life with syringes for the more than 1 million Americans with type I (or “juvenile”) diabetes. And on 13 July, President Bill Clinton took a break from peace talks between Palestinian and Israeli leaders to announce a mostly federally funded $5 million study aimed at replicating the Edmonton team's results that will be carried out at 10 centers in North America and Europe. But impressive as the Edmonton group's achievement was, some important caveats tended to get lost in the public enthusiasm.

    A big drawback is that transplant recipients would need to take immunosuppressive drugs for the rest of their lives to keep from rejecting the tissue. But even if the benefits of the transplants outweigh the risks of the drugs, transplant surgeons—including members of the Edmonton team itself—point to an even bigger problem: There's just not enough islet tissue to go around, and there won't be any anytime soon. “It's clear that [the Edmonton study] is progress. [But] I think it's years before they have a practical application,” says David Harlan, head of the transplantation and immunity branch at the National Institutes of Health in Bethesda, Maryland.

    Currently, islets are obtained from the pancreases of cadavers, and fewer than 6000 suitable pancreases become available each year, according to the United Network for Organ Sharing. Because it takes more than one pancreas to provide enough cells for a single transplant, that's not enough to make a dent in the problem. The Juvenile Diabetes Foundation estimates that more than 30,000 new cases of type I diabetes are diagnosed annually. Although efforts are under way to develop new sources of islet cells—growing them in the lab from stem cells and other sources—such cultured cells are still years away from human testing. “Our program now is really limited by our ability to get [islet] tissue,” says Jonathan Lakey of the University of Alberta in Edmonton, one of the surgeons who treated the original seven patients.

    Type I diabetes develops when the beta cells of the islets stop producing insulin, possibly because they are destroyed by an autoimmune attack. Researchers have been trying for years to treat the disease by transplanting islet tissue or the beta cells themselves. Aside from sparing patients the discomfort and inconvenience of daily insulin shots, allowing the body to produce its own insulin via the transplants may provide better control of blood glucose concentrations and might thus lower the risks of blindness, kidney failure, and other possible complications of diabetes.

    Results have generally been dismal, however, mainly because islet cells are extremely fragile. Researchers had a hard time obtaining enough tissue to make a difference, and it often didn't survive transplantation. Even the steroid drugs used for immunosuppression in these earlier efforts tended to kill the beta cells.

    The Edmonton investigators overcame these problems in part by eliminating steroids from their postsurgery treatment protocol in favor of newer, less toxic immunosuppressive drugs. Even so, they needed two donated pancreases to isolate enough functional islets to make each of the transplants successful. One of the original seven patients told Science that he was called in to the operating room and prepped eight times only to be told that not enough functioning islets could be isolated from the single donated pancreas then available.

    The demonstration that islet transplantation can be effective has provided further impetus to efforts to find ways to mass-produce insulin-producing cells in the laboratory. For example, Susan Bonner-Weir's team at the Joslin Diabetes Center at Harvard Medical School in Boston reported in the 5 July issue of the Proceedings of the National Academy of Sciences that they had grown isletlike groups of cells from the usually discarded duct cells of donated human pancreases.

    The cells, dubbed cultivated human islet buds or CHIBs, produce insulin in response to glucose. So far, however, the technique only produces about 30,000 CHIBs from the duct cells of one pancreas, whereas most patients in the Edmonton Protocol received something in the neighborhood of 700,000 islets. “It's a drop in the bucket compared to what you need for Edmonton,” says Bonner-Weir. Her team is working on ways to boost the output of CHIBs, and it has begun testing the cells in mice with an experimental form of diabetes. But Bonner-Weir estimates that it will be at least 3 years before CHIBs are ready for human trials.

    Other investigators have abandoned complex islets and gone right to the insulin-producing beta cells. Beta cells grow easily in lab culture, but previous efforts to produce them for transplant failed because the cells stop making insulin once they are cultured. Last month, however, molecular geneticist Fred Levine and his colleagues at the University of California, San Diego (UCSD), Cancer Center announced that they had reactivated glucose-responsive insulin production in beta cells grown from immortal cell lines.

    When the researchers transplanted these cells into immunodeficient, diabetic mice, the animals maintained normal blood glucose levels for up to 3 months. Still, the UCSD group has only put its beta cells through about 100 doublings, producing just “a tiny fraction of the numbers we would need for any industrial production,” Levine warns. The researchers are currently trying to boost cell production levels, but even with the most optimistic scenario, Levine adds, human clinical trials are 5 years away.

    Embryonic stem cells offer another promising source of beta cells. Earlier this year, Bernat Soria and his colleagues at the Institute of Bioengineering at Universidad Miguel Hernandez in San Juan, Spain, reported that they had not only produced betalike cells from mouse embryonic stem cells, but that the cells “cured” diabetes for more than 1 year after they were transplanted into genetically altered mice.

    Embryo-derived beta cells seem to produce insulin no matter how many times they have divided in vitro, Soria says. But one drawback—aside from ethical concerns surrounding human embryonic stem cell research—is that the cells may divide so prolifically that they risk giving rise to tumors once they are transplanted. That didn't happen with the Soria team's transplanted mice, but the method is not yet reliable enough to try in humans, he says.

    Ultimately, researchers may be able to solve the problems with their cultured cells and produce enough for all the diabetes patients who need transplants. Until then, says Lakey, diabetics will have to join the 70,000 other Americans who currently sit on waiting lists for organs. The Edmonton Protocol is “a cure like a liver transplant is a cure,” he says. “The patients have pagers and they wait.”

  14. INSTITUTIONAL PROFILE

    Scientific Ballooning's Buoyant Mood

    1. Mark Sincell*
    1. Mark Sincell is a science writer in Houston.

    Morale is high at the National Scientific Balloon Facility, as the crew looks forward to a new era of ultralong-duration balloon flights

    PALESTINE, TEXAS—It is almost 7 a.m. For the past 3 hours, a crew of 20 engineers and scientists has been working in the dark, preparing a balloon and its payload for this morning's launch from the National Scientific Balloon Facility (NSBF), the Cape Canaveral of the nation's balloon-borne research program. As the sky starts to glow pink and orange, several metal-sided buildings emerge from the gloom. They look like airplane hangars turned on end, nestled among 1100 hectares of the rolling, pine-covered hills of east central Texas. The five-story-tall control tower, bustle of activity, and faint scent of working machines are all reminiscent of a small-town airport, with one difference: The runway is round.

    The crew has less than an hour to get this balloon in the air. If the sun rises too high in the sky, the payload—a “lobster trap” containing an experiment to calibrate solar power cells—will have to go up another day. After a final check, the team loads the payload and a washing-machine-sized wooden crate containing the balloon onto a truck and then leads a caravan on the 1-kilometer trek from the hangar to the launch pad, a 300-meter-wide disk of asphalt in the center of a grassy clearing. Despite the activity, the scene is surprisingly silent. After almost 2000 balloon launches in the last 30 years, the NSBF team goes about its work with quiet confidence.

    Scientists value that expertise. “They are truly unsung heroes,” says Andrew Lange, a cosmologist at the California Institute of Technology in Pasadena who was the principal investigator for BOOMERANG, a balloon-borne experiment that in December 1998 made the most detailed map ever of the cosmic microwave background (CMB) and in the process discovered dramatic proof that the universe is flat (Science, 28 April, p. 595). Paul Richards, the principal investigator of MAXIMA, a balloon experiment that strikingly confirmed the BOOMERANG results just 6 months later, agrees. “They have done wonders,” says the University of California, Berkeley, cosmologist. The wonders arrive at cut-rate prices. Compared with their high-flying, publicity-grabbing cousins, satellites, balloons are amazingly frugal with time and money. For a small fraction of the cost of a satellite mission, a balloon can loft a payload into the stratosphere for days at a time. And an entire mission can be conceived, designed, flown, and analyzed in a matter of months. A comparable satellite mission could take a decade or more to complete.

    Those advantages made ballooning so popular that by the late 1970s the NSBF was flying upward of 80 missions a year from sites around the globe. For the past decade, however, ballooning has been in a steady decline: The NSBF launched only 20 missions last year. “Most of the easy, cheap science has been done,” explains NSBF Operations Manager Danny Ball. At the same time, Congress has slashed NASA's budget for new balloon payloads.

    But a comeback is in the works. This year's Decadal Review of astronomy has for the first time recommended that NASA increase support for this oft-neglected stepchild of the space program. And in June, the NSBF completed a milestone test flight of its Ultra-Long Duration Balloon (ULDB), a sealed balloon capable of carrying several tons to 37,000 meters for 100 days. If the final ULDB test flight—scheduled for launch in January 2001 from Alice Springs, Australia—is successful, the NSBF expects to start sending up scientific payloads on ULDBs by the end of 2001. “The day has finally arrived when balloons can be competitive with satellites,” says astrophysicist Jonathan Grindlay of Harvard University. “[The ULDB] will do big-impact science at incredibly low cost.”

    Pilgrimage to Palestine

    Palestine, about halfway between Houston and Dallas, is quite literally in the middle of nowhere. It's so remote, it's an ideal place for prisons. A roadside sign along the 10-minute drive from Palestine to the NSBF launch site warns drivers not to pick up hitchhikers: They could be escaped convicts from any of the four prison units that lie within 20 kilometers of the control tower.

    There is good reason for the NSBF's remoteness. A top-of-the-line balloon holds 40 million cubic feet (MCF), or slightly more than a million cubic meters, of helium. “That is about the same volume as the Houston Astrodome,” says Ball. Taller than the Eiffel Tower but as thin-skinned as a plastic bag, these mammoth balloons can lift almost 4 metric tons, about the same weight as three small cars—or one big Texas Cadillac, steer horns and all. In the event of an “unplanned descent”—the NSBF code phrase for a balloon crash—NSBF managers must ensure that no one on the ground is hurt. “So we look for areas with low population density,” Ball says.
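
    For readers who want to check the arithmetic, the rough sketch below (not part of the original reporting) estimates the lift of such a balloon. The air and helium densities are approximate standard-atmosphere values near a 37,000-meter float altitude, and the 2,000-kilogram envelope mass is an assumed figure.

```python
# Rough, illustrative cross-check of the quoted ~4-metric-ton payload for a
# 40-million-cubic-foot helium balloon. The densities and the envelope mass
# are assumptions for this sketch, not numbers reported in the article.
FT3_TO_M3 = 0.0283168

volume_m3 = 40e6 * FT3_TO_M3        # ~1.13 million cubic meters of helium
rho_air = 0.0060                    # kg/m^3, approximate air density near 37 km
rho_helium = 0.00083                # kg/m^3, helium at the same pressure and temperature
envelope_mass_kg = 2000.0           # assumed mass of the thin plastic film

gross_lift_kg = volume_m3 * (rho_air - rho_helium)  # displaced air minus helium weight
payload_kg = gross_lift_kg - envelope_mass_kg

print(f"gross lift ~{gross_lift_kg:,.0f} kg, payload ~{payload_kg:,.0f} kg")
# roughly 5,900 kg gross and just under 4,000 kg of payload,
# consistent with the "almost 4 metric tons" quoted above
```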

    In 1963, when the National Science Foundation moved its 2-year-old balloon program from Boulder, Colorado, you could hardly find a place on Earth as deserted as Palestine. (The program moved to NASA in 1982 and is now administered by the Physical Sciences Lab of New Mexico State University in Las Cruces.) But the population growth around Palestine and the desire for longer flights throughout the year have prompted the NSBF to go global. In addition to the Palestine site, the NSBF now regularly launches balloons from Fort Sumner, New Mexico; Lynn Lake, Manitoba, Canada; McMurdo Station, Antarctica; Fairbanks, Alaska; and Alice Springs, Australia. They have also made special launches from Brazil, Sweden, and the Northwest Territories of Canada.

    Wherever the wind blows

    When Ball arrives at the NSBF launch site at 4:30 a.m., he is already in a hurry. The first stop is the weather room, a spare linoleum and cinderblock affair lined with computers displaying weather maps. Balloonists take no chances with the weather. In the stratosphere, winds are a good thing; without them, balloons would never get anywhere. But when the balloon is on the ground, wind is the enemy. Launches are postponed if wind speeds top 5 knots (2.6 meters per second) at the surface or 12 knots at 150 meters, or if a thunderstorm crosses the flight path. The morning weather report indicates surface winds of only 2 knots. The launch is a go.

    It's crew chief Victor Davison's job to get the balloon into the air. In the predawn dimness, Davison conducts an orchestra of a dozen engineers, two trucks, and one 45-meter-long balloon. Instead of a baton, he holds a tethered pie-ball, a balloon about the size of a child's toy bouncing at the end of a 30-meter string. The air feels dead calm, but to Davison it is alive and kicking. The light breeze pulls the pie-ball to and fro, and Davison moves around until he is satisfied that he knows the exact direction of the wind. Then he gives the order to roll out the empty balloon, which resembles a very long, 60-centimeter-wide red garbage bag.

    Today's launch is unusual because the payload rides on top of the balloon to keep its electronic eye on the sun; only the communications package will hang below. This configuration requires a small extra balloon to levitate the experiment before the main balloon is inflated underneath.

    Everything looks ready, so Davison hops into the launch truck—a converted tractor-trailer to which the bottom of the main balloon is clamped—and gives the signal to start pumping. There is a sharp hiss of escaping high-pressure helium, and almost instantaneously the small balloon, a nearly transparent milky-white bubble, emerges from the tangle of hoses and bags strewn on the ground. Once the small balloon is full and the experiment is hovering a meter or two off the ground, an ear-splitting whine rends the heavy morning air as technicians start to inflate the main balloon. Ten minutes later, the clamp holding the top end of the balloon on the ground snaps open and the fully inflated balloon leaps into the air over the launch truck. Davison guides the truck directly under the balloon and lets the payload fly. The balloon rises into the sky, dangling its payload like the tendrils of a jellyfish.

    For the next half-hour, there is nothing to do but wait. Most of the crew heads for the control tower, a dusty, cluttered outpost standing five floors above the hangars where a dozen early-rising scientists have just begun the day's work of preparing for their own flights later this week. The control room is ringed with windows, but almost everyone's eyes are glued to a battery of computers that monitor the balloon's climb to 37,000 meters. Everyone except Bruce Anspaugh, that is. The Jet Propulsion Lab physicist who is the principal investigator on today's mission stays outside to watch his slowly ascending experiment. Even after almost 3 decades of yearly launches, he never gets tired of balloon flights.

    Cheaper and faster

    It is no coincidence that Anspaugh, like many of his colleagues, started ballooning as a graduate student. From its inception, the NSBF was intended to be “a training bed for scientists,” says Philip Lubin, an astrophysicist at the University of California, Santa Barbara, who observed the CMB from balloons before getting involved with the Cosmic Background Explorer satellite program. Because graduate students are notoriously short of cash and impatient to finish their theses, balloons have to be inexpensive and the experiments able to fly on short notice.

    They succeed admirably on both counts. The small 3-MCF balloon that Anspaugh is flying today cost a mere $30,000, and a giant 40-MCF balloon will set you back only $120,000. That's very small potatoes compared with a satellite's multimillion-dollar price tag. Add to that the fact that satellite experiments take decades to develop, and ballooning becomes an even better bargain. “An instrument built last month in the laboratory can be flying on a balloon this month,” Lubin says. In fact, because of the rapid turnaround and ability to reach near-space conditions, prototypes of virtually all satellite experiments are first tested on balloons. “You would have to be foolish to fly a new detector on a satellite without first testing it under the conditions of a balloon experiment,” says Jerry Fishman, an astrophysicist at Marshall Space Flight Center in Huntsville, Alabama, who led the Science Working Group of the Compton Gamma-Ray Observatory.

    Balloons do have their drawbacks, though. Once a satellite reaches orbit, it can stay there almost indefinitely. Few balloon flights, on the other hand, last more than 3 weeks. In the NSBF's so-called zero-pressure balloons, an opening in the bottom releases gas to keep the inside of the balloon at the same pressure as the surrounding atmosphere. When the air around the balloon cools at night, the slightly deflated balloon sinks, or droops, deeper into the atmosphere. To regain the lost altitude, scientists lighten the payload by dropping ballast; when the ballast is gone, the mission is over.

    Droop doesn't matter much to a scientist like Anspaugh, who can gather all the data he needs in an hour. But for astrophysicists who want to collect rare cosmic rays, make images of faint newborn infrared galaxies at the edge of the universe, or catch the gleam of an extrasolar planet, short missions won't do the trick. The lost hours spent at lower than optimal altitudes “really hit your science,” says Harvard's Grindlay. But there is a way to stop the droop: seal the balloon. And that is exactly what NASA and the NSBF are doing with their superpressure ULDB balloon program.

    The pumpkin-shaped ULDB balloons are about the same size as a large zero-pressure balloon. Because all the buoyant gases are sealed inside, the superpressure balloons being developed by Goddard at the Wallops Flight Facility in Wallops Island, Virginia, are impervious to the diurnal temperature changes that cause the droop that eventually grounds zero-pressure balloons. A ULDB should be able to stay at a constant altitude for at least 100 days. To maintain constant contact with a ULDB on its globetrotting journey, the NSBF will link these balloons to the constellation of three satellites that forms the Tracking and Data Relay Satellite System. The high-speed data link permits scientists to gather data from and send commands directly to the balloon over the Internet. “Now the scientists can just sit at home and watch the data flow in,” says ULDB project manager Steve Smith of Wallops.

    For Anspaugh, such high-tech frills are a needless luxury. Six hours after takeoff, his solar-cell experiment has parachuted safely into the west Texas desert, and the NSBF chase plane has flown him out to bring it back home. The crew seems buoyant, not just about one more successful mission, but about NSBF's future, as it prepares to open a new chapter in its history. As Grindlay says, “Ballooning is on a sharply upward trajectory.”

  15. TOXICOGENOMICS

    Toxicologists Brace for Genomics Revolution

    1. Richard A. Lovett*
    1. Richard A. Lovett is a writer in Portland, Oregon.

    Gene-array technology promises to deliver comprehensive profiles of toxic compounds, but validation will take years

    ASPEN, COLORADO—Millions of animals are raised in the United States each year for routine toxicology tests, exposed to compounds in food additives, cosmetics, and industrial products, and then studied for ill effects. This is a time-honored way of identifying human health risks, but it can be an imprecise science. It's also expensive and increasingly under attack by animal-rights activists as wasteful. Now, according to researchers who gathered at a high-powered summit* this month, toxicology may be on the verge of changing the way it collects raw data—adopting a process that could reduce animal use and improve test results.

    The new approach, called “toxicogenomics,” grows out of the human genome project. Rather than using animal pathology to identify illnesses, it probes human or animal genetic material printed on plates, called DNA arrays. Cancer researchers have already been using such arrays for several years to compare gene expression in healthy and diseased cells (Science, 15 October 1999, p. 444). Toxicologists are using the same technology to profile gene expression in cells exposed to test compounds.

    The advantages of these DNA tests are legion: They are fast and efficient, and they reduce live-animal expenses, which can run as high as $3000 per animal per week when nonhuman primates are used. Some of the biggest gains may come in cancer toxicology: New tests may be able to spot the metabolic precursors of slow-developing diseases without holding up research for the months or years it takes for tumors to develop. If adapted for use in tissue cultures, these tests might even eliminate the need to sacrifice animals.

    Toxicogenomics is advancing so rapidly as a specialty that the National Institute of Environmental Health Sciences (NIEHS) this spring opened a new National Center for Toxicogenomics in Research Triangle Park, North Carolina. Its express purpose is to spur the development of gene-based toxicity studies. But some leaders in the field warn against rushing too quickly to embrace DNA tests, which are still difficult to interpret. Doing so, they say, could exaggerate some risks and understate others—halting research on promising new products while overlooking life-threatening toxicities that would have shown up on traditional bioassays. “We have to be careful we don't drive beyond our headlights and run into a wall,” cautions Joseph DeGeorge, a pharmacologist at the U.S. Food and Drug Administration's (FDA's) Center for Drug Evaluation and Research.

    Mountains of data

    The basics of toxicogenomics are straightforward, although details vary from lab to lab. The hardware uses gene arrays bearing such names as “ToxChip” or “ToxBlot” that contain thousands of genes that might be affected by toxic chemicals. These genes, arranged on plastic or glass plates about the size of microscope slides, bind to matching genetic material extracted from animals or cell cultures exposed to the substance being tested. The extracted genetic material, called messenger RNA, comes only from genes that are currently active; it is reverse transcribed and tagged with a radioisotope or a fluorescent marker to simplify detection. Researchers sometimes use a red marker for material from treated cells and a green one for untreated controls. When labeled sequences are tested on a single array, both treated and untreated types bind to a gene site, with the resulting color at each site showing the degree to which that gene has been turned on or off by the putative toxicant.
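
    The sketch below, which is not drawn from any of the labs mentioned here, illustrates that readout step in miniature: the red and green signals at each spot are reduced to a per-gene log ratio, the quantity typically used to call a gene induced or repressed. The probe names and intensity values are invented.

```python
# Minimal sketch of a two-color array readout: for each spotted gene, compare
# the red (treated) and green (control) fluorescence and report a log2 ratio.
# Probe names and intensities are hypothetical.
import numpy as np

genes = ["geneA", "geneB", "geneC", "geneD"]
red = np.array([5200.0, 8100.0, 410.0, 950.0])      # signal from treated cells
green = np.array([1300.0, 2000.0, 1650.0, 980.0])   # signal from untreated controls

log_ratios = np.log2(red / green)   # > 0: turned on by the chemical; < 0: turned off
for gene, lr in zip(genes, log_ratios):
    call = "induced" if lr > 1 else "repressed" if lr < -1 else "unchanged"
    print(f"{gene}: log2(red/green) = {lr:+.2f} ({call})")
```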

    The great promise of toxicogenomics is that it might be used to scan the entire human genome to see which genes are affected by specific chemicals. “Right now, that's not feasible,” in part because not all the genes can be placed on arrays, says Richard S. Paules, toxicogenomics facilitator at NIEHS, “but at some point it may be.”

    The immediate goal, Paules says, is to look at different classes of compounds and identify groups of genes that are tightly correlated with known classes of toxicants. These “very informative” genes could then be used to generate a next-generation array with a small number of genes. The condensed set could be used routinely to determine if a test chemical exhibits any of several common toxicities.
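
    One plausible way to pick such a condensed set, sketched below with synthetic data rather than the actual NIEHS procedure, is to score every gene on how well its expression separates compounds of a known toxicant class from the rest and then keep only the top scorers.

```python
# Illustrative only: rank "very informative" genes by how strongly their
# expression differs between two known classes of compounds, then keep a
# small set for a condensed follow-up array. All data here are synthetic.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
n_compounds, n_genes = 30, 2000
profiles = rng.normal(size=(n_compounds, n_genes))   # fake log-ratio profiles
labels = np.array([0] * 15 + [1] * 15)               # 0 = class A toxicant, 1 = class B
profiles[labels == 1, :25] += 2.0                    # plant 25 genuinely informative genes

scores = np.array([
    f_oneway(profiles[labels == 0, g], profiles[labels == 1, g]).statistic
    for g in range(n_genes)
])
condensed_set = np.argsort(scores)[::-1][:50]        # genes for the small routine-screening array
print("most informative genes:", sorted(condensed_set[:10].tolist()))
```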

    After a critical gene set is identified, the real work of sifting wheat from chaff begins. DNA arrays generate mountains of data. A single experiment, Paules says, can produce 300,000 data points; computer pattern-recognition programs must be used to tease out the meaning—a job that researchers describe as “mind-bogglingly complex.” But Paules doesn't see it as impossible. Toxicogenomics is just in its infancy, he says, like clinical pathology before doctors learned how to recognize biopsy samples as benign or cancerous: “It takes years of experience” to make such distinctions. Once the appropriate databases are assembled, toxicogenomics will become something like digitized pathology. “It's like taking the field of pathology from the Wright brothers to the moon, from a subjective art to measuring thousands of parameters,” Paules says. “But to do that, you need a very robust database.”
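
    A common flavor of that pattern recognition is clustering, in which compounds with similar gene-response fingerprints group together. The fragment below is a generic illustration with random stand-in data, not the center's actual software.

```python
# Generic sketch: hierarchically cluster compounds by the similarity of their
# gene-expression fingerprints. All numbers here are random stand-ins.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
fingerprints = rng.normal(size=(12, 5000))            # 12 compounds x 5,000 genes (synthetic)
fingerprints[:4] += rng.normal(2.0, 1.0, size=5000)   # pretend 4 compounds share a mechanism

distances = pdist(fingerprints, metric="correlation")  # 1 - correlation between profiles
tree = linkage(distances, method="average")
groups = fcluster(tree, t=3, criterion="maxclust")     # cut the tree into 3 groups
print("cluster assignment per compound:", groups.tolist())
```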

    The biggest challenge will be interpreting the results of these analyses. Simply observing that a chemical changes a cell's gene expression is meaningless: Virtually any change in the environment will do that. The body makes complex cellular-level adjustments, for example, just to cope with waking up in the morning or moving to a higher elevation. One of toxicologists' greatest fears is that people with antichemical axes to grind will obtain gene-array results and create public hysteria. For that reason, toxicologists agree that it's not enough simply to compare a test chemical's effects on gene expression to those of known toxicants. It's also necessary to validate the entire process by correlating such changes to actual illnesses.

    It may be years before gene arrays are widely accepted for routine chemical screening. But they're already being used in research, where they help identify biochemical pathways that are vulnerable to chemical interference. In the past, toxicologists learned to identify such pathways by becoming expert in each step in the process—a method that Bill Pennie, head of investigative toxicology at Zeneca Central Toxicology Laboratory in Macclesfield, U.K., refers to as the “one gene, one postdoc approach.” It sometimes took an army to do the job. Gene arrays are streamlining this process by revealing which genes are affected by various categories of chemicals. Researchers can then turn to conventional techniques for a detailed analysis.

    Scientists are now using this mixed strategy to probe a class of chemicals known as peroxisome proliferators, which includes certain herbicides, pharmaceuticals, and plasticizers. Animal bioassays reveal that these compounds cause liver tumors in rodents, but by a mechanism that most toxicologists believe isn't relevant for humans. Not surprisingly, researchers would like to know more about how these chemicals affect liver metabolism.

    NIEHS has run several of these compounds through its own gene-array process, called ToxChip. The research is preliminary, but it is revealing that many of the affected genes have already been identified in the toxicological literature—a useful validation of the test. Just as important, however, is the discovery that about half the genes identified by ToxChip weren't previously known to be involved in peroxisome proliferation.

    Pennie's research team has been making similar use of gene arrays to study the mechanisms by which estrogenlike chemicals affect organs as disparate as the brain, uterus, ovaries, and testes. It's not yet a precise technique—Pennie calls it “stamp collecting”—but the gene data help researchers generate hypotheses about how biochemical pathways work—hypotheses that they can then test in gene-altered mice.

    A coming boom?

    These near-term uses are not what thrill toxicogenomics fans, though. Their Holy Grail is to develop routine gene-array screens that can be used to catalog the risks of previously untested chemicals. And despite traditionalists' concerns, this dream is drawing near.

    Drug companies are among the most enthusiastic about the vision, because they're interested in finding ways to speed the process of toxicological testing to keep pace with new R&D techniques that have vastly increased the rate at which candidate drugs are being developed. The industry would like to weed out potentially dangerous ones early in the expensive development process, says David Essayan, an assistant professor of medicine at Johns Hopkins University. Health officials also would like to find tests that can reduce toxic drug reactions, estimated to cost the nation $77 billion a year, Essayan says.

    Support for toxicogenomics may flourish outside the technical community, too. Penelope Fenner-Crisp, senior science adviser to the director of the U.S. Environmental Protection Agency's Office of Pesticide Programs, says animal-rights advocates like the technology because they see it as a way to pursue the “three R's” of conscientious animal research: replacement, refinement, and reduction. “There will be pressure to use these technologies, probably sooner than the scientists think they're ready,” she predicts. Already, she adds, European countries have imposed legislative constraints on animal research, and similar U.S. proposals are always “hovering” in the background.

    Given these pressures, toxicogenomics isn't going to wait demurely in the wings while scientists validate it, says FDA's DeGeorge. Fenner-Crisp agrees, advising scientists to expect to be pushed to use these tests sooner than they would like. And, she adds wryly: “We'll probably be among those who press you.”

    Jay Goodman, a professor of pharmacology and toxicology at Michigan State University in East Lansing, however, urges scientists not to be coerced into using the new techniques before they've been properly “anchored” by comparison to known toxicological responses. Whatever the pressures from animal-rights organizations, he says, “we need to be true to the science and let this sort itself out in the peer-reviewed literature.”

    • *26th Annual Summer Meeting, The Toxicology Forum, 10 to 14 July.

  16. The Shots Heard 'Round the World

    1. Eliot Marshall*
    1. With reporting by Laura Helmuth.

    The massacre at Columbine High School last year unleashed a torrent of fresh concern over the threat that violence poses to society. It also energized a government research effort to understand and prevent violence

    On a cool spring morning in April 1999, two forlorn teenage boys smuggled weapons into Columbine High School in Littleton, Colorado, and began a methodical murder spree, killing 12 of their peers, a teacher, and themselves. The massacre left deep wounds in the community and touched a national nerve. It also changed the lives of many scientists.

    “Littleton was the wake-up call,” says Steven Hyman, director of the National Institute of Mental Health (NIMH). Government-funded programs that had been working in relative obscurity on the causes of violence suddenly were in the spotlight. Researchers, however, had few immediate answers to offer a public desperate to know what demons were tormenting middle-class white children in the suburbs. “It got the attention of Congress,” says Hyman. The groundswell of concern, he says, persuaded influential leaders such as Senator Arlen Specter (R-PA), a member of the appropriations committee, “to see youth violence as a public health problem.”

    Specter earmarked about $900 million in this year's budget of the Department of Health and Human Services (HHS), which oversees NIMH's parent agency—the National Institutes of Health (NIH)—for programs to study and combat youth violence. He hopes for more in 2001: The Senate appropriations report proposes $1.2 billion for a “youth violence prevention initiative,” much of it to reorient existing education and public health programs. The report also asks NIH to step up support for behavioral studies of children and adolescents who are at risk for becoming violent (see p. 580). Among other things, the Senate report urges NIMH to develop better ways of preventing child neglect, treating attention deficit disorder, combating depression and suicidal thinking, and evaluating models of a tough social education process called “therapeutic foster care,” an alternative to jail for some delinquents. The funding bill passed the Senate last month but isn't likely to clear Congress until late fall.

    Surgeon General David Satcher, meanwhile, is preparing a major report on strategies for treating violence as a disease—an approach he has long advocated. It's due this fall. President Clinton has launched an independent, nonprofit outfit—the National Campaign Against Youth Violence—to create Web sites, videos, and other media products in a wishful effort to change attitudes on violence in 13 target cities. He also set up a White House National Council on Youth Violence Prevention nominally made up of Cabinet members; it's been meeting monthly since January.

    Ironically, this flurry of activity comes at a time when youth violence, as reflected in crime statistics, is in decline. The trend has been downward in the United States since 1993, the all-time peak for homicides committed by young men, according to criminologist Alfred Blumstein of Carnegie Mellon University in Pittsburgh. A surge of homicides between 1985 and 1993—fueled, Blumstein says, by drug-related violence chiefly involving minorities in big cities—provoked a strong response. Elected officials adopted tougher drug laws, longer prison sentences, and boot camps for delinquents— a series of get-tough moves often credited with reducing the crime rate. Social scientists, however, remain skeptical that these policies in isolation made the difference (see p. 582).

    Hand in hand with the crackdown came a boom in government-funded research on crime. Congress in 1994 gave the National Science Foundation $12 million to start an antiviolence research center. The foundation plowed the money into a “virtual” institute of electronically linked experts called the National Consortium on Violence Research, which has access to a unique U.S. Census database on crime victims. The center coordinates studies on crime and aggression—for example, observing the social development of delinquents and criminals—with an eye to improving policies. The consortium, headed by Blumstein, is now up for a 5-year renewal. The interest in violence also boosted preexisting research programs at the Centers for Disease Control and Prevention (CDC) in Atlanta and at the National Institute of Justice.

    [Figure: Roller-coaster ride. Recent U.S. school shootings belie the fact that homicide rates are falling. SOURCE: FBI]

    Eager to contribute to this new antiviolence agenda, research chiefs at NIH are also paying attention to environmental and behavioral studies—and talking less about biochemistry. Hyman, for example, distances himself from the policies of an earlier administration, which was fascinated by inherited traits that contribute to criminal behavior. Former director of the Alcohol, Drug Abuse, and Mental Health Administration, Frederick Goodwin, supported work on the biochemical markers of aggressive behavior in animals, such as low levels of serotonin in spinal fluid. A 1992 federal initiative on violence got entangled both in this sort of research and in an NIH-funded conference on the “genetics of crime.” Attacked on ideological grounds, the conference was first canceled, then reinstated, and finally held in 1995. But the HHS antiviolence initiative itself petered out (see sidebar).

    Today, Hyman says, “we no longer take seriously the rather impoverished and deterministic models” that assume “the level of serotonin in your cerebrospinal fluid predicts whether you are going to kill yourself or someone else.” Instead, his institute has moved to less controversial ground, adopting the view that violent behavior can best be understood—and prevented—if it is attacked as if it were a contagious disease that flourishes in vulnerable individuals and resource-poor neighborhoods. Scientists continue to probe the biological sources of violence in humans and in animals, however, arguing that valid lines of research have emerged from a field whose past is haunted by charlatans (p. 575). But to Hyman, the decisive public shift in thinking about violence—emphasizing social rather than biological factors—is “very, very healthy.”

    Taking stock

    Because NIMH still funds the largest portfolio of basic research on violence, about $40 million per year, it has assigned itself the task of reviewing practical measures—or “interventions,” according to the jargon—that agencies might try out to divert young people from criminal behavior. NIMH staffer Della Hann has labored for more than a year, Hyman says, “taking stock” of what's been funded in the past and “sifting the wheat from the chaff to see what we actually know.” The review examined decades' worth of NIMH-funded studies on risk factors that appear to contribute to aggressive behavior, such as neglect and abuse in childhood, harsh and inconsistent discipline, and associating with antisocial peers.

    Hyman believes NIMH may have invested enough in identifying such risks. “Sometimes you get a cottage industry in doing certain kinds of research,” he says, going repeatedly over the same ground. Now it is time to move on, he suggests, and apply this research to experiments in the real world. Assuming that youth violence can be treated as an illness, he says, “we want to try to target the risk factors, try to develop interventions, and make generalizations” about what's been learned. “We would do small efficacy trials first, then large effectiveness trials.” Hann says NIMH staffers have sorted through more than 200 research projects, selecting the most promising for discussion in a report to be finished this month.

    A few themes emerged from this retrospective analysis during a meeting of experts organized last October by the NIH director's Office of Behavioral and Social Sciences Research. One important message, according to NIMH official Farris Tuma, is that some well-meaning programs designed to control aggressive children may be worse than useless: They may be doing harm. For example, collecting young people in group homes or sending them to boot camps or on wilderness ordeals—popular in many states—may intensify rather than reverse antisocial behavior.

    Clinical psychologist Thomas Dishion and colleagues at the Oregon Social Learning Center in Eugene, Oregon, for example, studied the development of a group of about 200 boys over 5 years. They found a consistent pattern: Rule-breaking children trained others in misconduct. Boys who did not smoke tobacco or marijuana or abuse alcohol before age 13 or 14, but who became friends of boys who did, advanced in a statistically predictable way to become substance abusers 2 years later. The researchers found that this “deviancy training” produced boys at ages 14 to 16 who admitted to acts of delinquency. The same process, they argue, molds criminals and violent adults. Dishion's group believes it's a terrible mistake to house young delinquents together. Borrowing a medical term for harm caused by treatment itself, they claim this practice has the “iatrogenic” effect of magnifying the problem.

    Studies by criminologist Delbert Elliott and colleagues at the University of Colorado, Boulder, also support this conclusion. By comparing the development of delinquents and nondelinquents, Elliott's group determined that the final step in a path to “serious offending” was invariably the same—joining an antisocial peer group. This may sound obvious, but it's important to have it documented, Hyman says, for it indicates that government-supported programs are heading in precisely the wrong direction. By aggregating delinquents in group homes or sending them through the adult justice system—as many localities are doing—Hyman maintains, “we are sending them to graduate school for violence and delinquency.”

    Searching for better ideas, the program review groups at NIH and elsewhere have identified strategies that have been tested and might have a better chance of preventing violent behavior. NIMH staffers point to work by Elliott called “Blueprints for Violence Prevention,” funded by the state, the CDC, and the Department of Justice. To cull the best from a large crop of local programs, Elliott's group established some rigorous screening criteria. They insisted that to qualify as a success, a program had to be designed as a true experiment with clear outcomes, produce quantifiable benefits, exhibit long-term improvements among its graduates, and have its good results replicated at more than one site. A staffer says more than 500 programs have been screened through this sieve; Elliott's team identified 10 proven winners (www.colorado.edu/cspv/blueprints/Default.htm).

    Interventions can't start too early, apparently. In one program, nurses make in-home visits to pregnant women and return periodically until the child is 2 years old. Fifteen years later, those children seem less likely to abuse drugs or commit crimes. A bullying-prevention program focuses on elementary schools and attempts to change the social climate so that intimidation is a less acceptable way to solve problems, says Elliott. And an intriguing experiment designed to keep adolescent delinquents from becoming criminals evolved under the direction of Patricia Chamberlain at the Oregon Social Learning Center. The approach is simple, but demanding. The researchers recruit and pay foster families to accept a delinquent child into their homes for a period of about 7 months. The delinquent, rescued temporarily from jail or confinement to a group home, must follow strict rules. The parents are drilled in rule enforcement and provided with access to round-the-clock professional support.

    Chamberlain says her group began working on what she calls “multidimensional treatment foster care”—a form of closely supervised parenting with professional backup—in the early 1980s. In 1990 her group received funding from NIMH to conduct a clinical trial. Delinquent boys (85% white and about 14 years old) were randomly assigned either to therapeutic foster care or to regular group homes. The results after the first year: Boys in foster care were far less likely to run away than those in group homes (30.5% versus 57.8%). Most important, boys in foster care were less likely to get into trouble, spending 60% fewer days behind bars in the year after treatment. Creating a network of foster families to treat one delinquent per household is difficult. But an economic analysis of the NIMH-funded study by the Washington State Institute for Public Policy in Olympia, Washington, found in 1999 that therapeutic foster care, costing $1934 per delinquent, is a bargain. The economists calculated that standard law enforcement costs an additional $27,000 per delinquent, mainly because boys held in juvenile detention are more likely to go to prison.

    To expand the search for innovative programs, NIMH and three other NIH institutes are putting up $3 million this year. Their January 2000 request for applications offers to support about a dozen new projects on youth violence at $200,000 each, for up to 3 years. But NIH doesn't want more studies on risk factors, noting that they already make up two-thirds of the current portfolio. Instead, it wants well-designed social experiments. “Behavioral interventions should be thought of just like drugs,” Hyman says: “It's easy to fool yourself about efficacy if you haven't done a proper clinical trial.”

    Biomedicine's role

    The emphasis on using behavior-altering techniques for blunting violence, according to NIMH staffers, should not be seen as a rejection of biology. It's just a matter of what's ready for prime time, says Hyman. “We continue to fund long-term research … on the circuits in the brain involved in all emotion.” But this work on neural circuitry is “still in its early stages,” he says: “Perhaps the most interesting question is how context interacts with the human brain to unleash different types of violence. After all, no airplane dropped DNA on the former Yugoslavia or Rwanda to unleash genocidal impulses.” But he doesn't think it makes sense to promote studies of hormones such as serotonin—which he calls “bodily fluids research”—because such approaches have revealed “nothing about cause or mechanism” and have been “relatively sterile with regard to generating good follow-up questions.”

    It's easy to exaggerate the meaning of biological findings, Hyman asserts. As an example, he points to studies of the prefrontal cortex, such as work by Adrian Raine of the University of Southern California in Los Angeles, that have associated decreased gray matter or injury with violent behavior. The research is “very well done,” Hyman says. But he also thinks that some people, including Raine himself, read too much into the findings.

    A recent commentary by Raine on the brain scans of a group of California murderers ran under the provocative headline: “Can We See the Mark of Cain?” To Hyman, this is reminiscent of phrenology, the 19th century practice of analyzing bumps on the skull to identify personality types. Hyman thinks that predicting violence on the basis of a brain scan is “vanishingly unlikely.” Raine acknowledges that his work has run into criticism, which he considers mainly “political.” But he insists that “more and more scientific evidence” demonstrates “a clear biological and genetic basis to crime and violence.” Says Raine, “If you stick to the facts, science will win the day.”

    Scans may be useful, Hyman concedes, as a “first-cut” approach to finding links between risky behavior and physiological problems. Studies have shown that some children who get in trouble experienced head injuries early in life. Doing brain scans might make sense if it were possible to offer them specific therapy. But Hyman suggests that, as a leader in policy discussions on violence, NIMH should advance interventions that have been tested and proved useful.

    But even some widely accepted therapies remain controversial. Children who fidget or disrupt class—if diagnosed with attention deficit hyperactivity disorder (ADHD)—are often prescribed Ritalin, a stimulant, to help them concentrate. However, a recent study of the widespread use of stimulants and other psychoactive drugs in young children has been described by Harvard psychiatrist Joseph Coyle as “troubling.”

    “Ritalin is both underused and overused,” is Hyman's diplomatic view. Clinical trials have proved Ritalin safe and effective for treating ADHD—even “superior to purely behavioral interventions,” says Hyman. Yet minority children don't get Ritalin when they most need it, in part because of negative publicity. At the same time, it is prescribed at higher rates than expected, considering the occurrence of ADHD (3% to 5%). “Kids who are not treated for attention deficit hyperactivity disorder,” he says, “are known to have a higher risk of drug abuse and also of offending and getting involved with the criminal justice system.” NIMH may try to redress the imbalance by offering straight information and funding to inner city researchers.

    Indeed, psychiatrist Carl Bell of the Community Mental Health Council in Chicago says that the “antipsychiatry” movement, including the Church of Scientology, has plastered his neighborhood with “slick brochures” about a “genocidal plot” to drug minority children, and this has scared people. Bell, who is black himself, says he would welcome federal help in running clinical trials of Ritalin and other drugs that might help his patients. He argues that only a clinic like his own, which is controlled by a community board, will be trusted to carry out research of this kind.

    NIMH also wants to intensify its focus on quiet children whose troubled behavior may be important to identify early. “A lot of attention has been paid” to agitated and disruptive children, Tuma says, “but not enough to the withdrawn, shy child.” Many of the teenagers involved in recent school shootings “were not running around calling attention to themselves,” Hyman notes. They were “sullen, withdrawn, alienated, and potentially depressed.” Indeed, the older of the two Littleton shooters, investigators later discovered, had been taking a prescription drug for depression, Luvox, and had threatened peers long before the shootings. NIMH is investing $15 million in a clinical trial of adolescent depression coordinated by Duke University, Hyman says, to test the efficacy of Prozac, cognitive behavior therapy, and a combination of both versus a placebo. Ten centers have signed up since 1998 to enroll 430 children between the ages of 12 and 17; results could appear as soon as 2004. Hyman sees it as an important first step: “We need to establish the safety and efficacy of treatments for depression among school kids and adolescents,” a topic that hasn't been adequately studied.

    After Littleton, says Hyman, self-styled experts came out with all kinds of do-it-yourself “checklists” for identifying potential killers in the classroom. He sees the trend as dangerous, mainly because it could stigmatize vulnerable children—most of whom are not inclined to violence—and make them more resentful and withdrawn. Researchers can contribute to violence prevention initiatives, Hyman says, by disseminating scientifically vetted information to teachers, counselors, and the public. It's critical, he says, to help people identify good therapies, and to avoid personality checklists and the “gulag point of view”—and, for the moment, to push the genetics of violence offstage.

  17. A Sinister Plot or Victim of Politics?

    1. Eliot Marshall

    When the Bush Administration proposed a “violence initiative” 8 years ago, an innocent bystander—David Wasserman—suddenly found himself at ground zero in a battle over racism and psychiatry. Wasserman, a policy researcher at the University of Maryland, Baltimore, says he was taken by surprise when his proposal to host a meeting on the genetics of crime sparked an explosion of antigovernment lobbying. He was on his honeymoon when activists began hurling accusations that his conference was part of a government conspiracy to dope up inner city kids on psychoactive drugs.

    The accusations were keyed to a policy then being formulated by Louis Sullivan, Bush's secretary of Health and Human Services (HHS). In 1992, Sullivan wanted to launch a public health initiative to reduce urban violence. At the time, the per capita homicide rate in the United States was climbing to its highest level on record. But activists led by Peter Breggin, an independent psychiatrist in Bethesda, Maryland, saw a sinister purpose in Sullivan's plan. Citing biochemical research on aggression funded by the National Institutes of Health (NIH), Breggin claimed that NIH-funded psychiatrists were hatching a scheme to pacify inner city children with drugs such as Prozac and Ritalin (Science, 9 October 1992, p. 212). Breggin still pushes this message in a series of books attacking drug therapy.

    It didn't help when the director of the Alcohol, Drug Abuse, and Mental Health Administration, Frederick Goodwin, made some remarks in 1992 about the “urban jungle,” comparing aggressive behavior by humans and monkeys. Members of the Congressional Black Caucus took offense. In the resulting furor, the Bush Administration scuttled the conference on crime.

    Three years later, however, Wasserman got to hold his meeting. “I was ready to quit,” he recalls, but the university regarded the funding cancellation—ordered by then-NIH director Bernadine Healy—as “irresponsible,” he says, and a breach of academic freedom. The university appealed, won, and got a bigger grant from NIH ($133,000). In September 1995, Wasserman went ahead with the planned session on “The Meaning and Significance of Research on Genetics and Criminal Behavior.” Protesters, invited by some sympathetic participants, showed up and invaded the hall (Science, 29 September 1995, p. 1808).

    During the free-for-all, attendees exhibited some real aggression, Wasserman says. One invited researcher shoved a protester. At that point, a neurogenetics researcher from NIH stepped in and, acting as a prosecutor in an ad hoc trial, charged his fellow scientist with misbehavior. Wasserman, serving as judge, ruled that the accused would have to leave the conference. The meeting then continued without further incident. Since then, Wasserman has edited the commissioned papers, and Cambridge University Press plans to publish them this year.

    Sullivan's proposed violence initiative didn't fare so well. Susan Solomon, acting director of NIH's Office of Behavioral and Social Sciences Research, confirms that it simply died with the passing of the Bush Administration. A postmortem conducted in 1993 and 1994 by an HHS “blue ribbon” panel had several unremarkable suggestions, including advising the department to be more “culturally sensitive.”

    HHS adopted a few recommendations and ignored others. But it took one proposal to heart: to increase public funding of violence research. Since 1992, funding has grown significantly, although federal officials now stress the importance of social factors that contribute to violence and utter hardly a peep about genetics.

  18. In Europe, Hooligans Are Prime Subjects for Research

    1. Michael Hagmann

    CAMBRIDGE, U.K.—When fans of two of the largest Dutch soccer teams (or football clubs, as they're called in Europe) met on 23 March 1997, it seemed at first like business as usual: a great many flying fists, a few cars set ablaze. Then a fan stabbed a supporter of a rival club to death. The public responded with shock to the first football casualty on Dutch soil, but for behavioral biologist Otto Adang, the murder, though tragic, wasn't unexpected. “This was bound to happen sooner or later,” he says.

    Adang has studied football hooliganism and riot behavior since the mid-1980s, using observational methods that he developed when studying aggressive behavior in the Arnhem Zoo's chimp colony. In what Menno Kruk, a behavioral neurobiologist at Leiden University, calls “the first big field study on hooliganism,” Adang says he “traveled around in the Netherlands for 3 ½ years looking for trouble.” All told, he attended 225 football matches and protest demonstrations, recording the behavior of protesters and hooligans on more than 700 hours of audio tape. Such a systematic approach, says Adang, allowed him “to compare various situations … to answer why some of them turned into riots and others didn't.” The 1997 stabbing death, he says, was foreseeable because not only had football hooliganism become more violent, but it had also moved away from the football grounds, making it harder for the police to anticipate where clashes were going to erupt.

    Research into football hooliganism is one of the few burgeoning areas of violence research in Europe. The University of Leicester in the United Kingdom even established an entire research unit—the Sir Norman Chester Centre for Football Research—on the topic in 1987 (although the center has broadened its scope since). In contrast, other types of violence studies are languishing here, researchers say. With lower homicide rates than in the United States and fewer killing sprees such as the Littleton school shooting, Europeans are less concerned about violence than Americans are—and that translates into less money for research on the topic. Moreover, some scientists argue that strict regulation of animal studies has dealt a severe blow to a once-proud European tradition of behavioral research on animal aggression.

    “The number one fear of Americans is becoming the victim of a violent crime,” notes Terrie Moffitt, an American expatriate developmental psychologist at King's College in London. That's something Europeans worry about much less, even though international surveys from some 20 different countries show that “the probability of being assaulted is not that different in the U.S. and [Great Britain],” Moffitt says. The big difference is in the outcome: “The homicide rate is indeed quite a bit higher in the U.S.,” Moffitt notes. Says Patricia Brennan of Emory University in Atlanta: “If Europe had the same number of guns, they probably would have the same number of homicides.”

    Although they don't envy Americans their homicide rates and Littleton headlines, European violence researchers can't help but envy the attention and funding such tragedies bring. “In the U.S. it's possible to get grants in the million-dollar range to do big science,” from agencies such as the National Institute of Mental Health (NIMH), says Kruk; private U.S. foundations also make significant contributions. In most European countries, however, there is no equivalent of NIMH, and philanthropic support for violence research is virtually nonexistent. As a result, European violence projects usually “get a grant of maybe $20,000,” says Kruk.

    With funds so scarce, “a lot of violence researchers moved on to different fields such as depression, anxiety disorders, and such,” says German-born psychopharmacologist Klaus Miczek of Tufts University in Medford, Massachusetts. The consequences of this exodus are plain to see. The attendees at a recent meeting in Britain on severe antisocial personality disorders “all fitted in one small room,” says Moffitt. “At any given American Society of Criminology meeting you'll have about 5000 people—and that doesn't even include the psychiatrists.”

    That hasn't always been the case. In the 1960s and '70s, says Miczek, “the Netherlands and Britain were the mecca of aggression research [in animals]. The Europeans really were trendsetters, introducing new and quantitative ethological methods” to study animal behavior, following in the footsteps of founding fathers Niko Tinbergen and Konrad Lorenz and their disciples. But more recently, aggression research has met with severe criticism from animal rights activists, who object to experiments in which animals attack and injure one another.

    “The animal rights movement entirely shut off aggression research in animals” in the United Kingdom and other parts of Europe, says Miczek. The Max Planck Institute for Psychiatry in Munich, for example, once ran a large program to decipher the neuronal circuitry behind aggressive behavior using tiny electrodes to stimulate various brain areas in primates. “This is all but gone now. I don't know of anybody who's still doing [this kind of] primate aggression research,” says Miczek.

    A few areas of research other than hooliganism are managing to remain relatively strong here, however. Among them are longitudinal studies, in which groups of people are followed from early childhood into adulthood to pinpoint factors that might help foster violent behavior. Such studies originated in Scandinavia, where governments keep a wealth of social records, making it an El Dorado for this kind of research. The findings often are picked up by American researchers, who have a tendency to “want to fix the problem and come up with prevention programs,” says Kruk.

    Importing this philosophy is Adang, who is now pursuing his research at the Dutch Police Academy in Apeldoorn. With funds from a European Union grant, Adang sent observers to every city that hosted matches during EURO 2000, the European football championship that took place in the Netherlands and Belgium last month. “There's a lot of opinions but very few hard facts about the value of preventive police tactics. We wanted to use [EURO 2000] as a street lab of sorts to test some of our hypotheses, predictions, and recommendations based on earlier studies.” One of those recommendations was that police should mingle with the hooligans and talk to them—“really anything that destroys their anonymity,” says Adang. So far, he's cautiously optimistic: “The number of serious incidents was very small. It seems to work.” -MICHAEL HAGMANN

  19. Searching for the Mark of Cain

    1. Martin Enserink

    Hampered by political, ethical, and methodological problems, a small group of researchers is trying to understand the biological roots of violence—with the ultimate hope of finding a treatment

    MEDFORD, MASSACHUSETTS—There's a cocktail lounge in this Boston suburb where the customers are encouraged to overindulge. It's a place where mice get tipsy. A thirsty mouse pokes its nose through a hole, where a sensor takes its order; then it scampers over to a spout a few centimeters away, which immediately dispenses a few drops of an alcohol solution roughly the strength of beer. Mice don't know when to stop. Invariably they scoot back to trip the sensor for another round, and so on, until a computer program decides the bar is closed—by which time the rodent has ingested the equivalent of one or two human drinks.

    Klaus Miczek, a psychopharmacologist at Tufts University here, wants to know why this binge drinking turns about one in four mice extremely nasty. Miczek has just put a sober male mouse into a cage with one such inebriated male. The drinker starts chasing the newcomer and attacks it within seconds. The sober mouse raises its front paws to ward off the teeth of its attacker and to signal its submissiveness. But the defensive postures and peace overtures are snubbed: By the time the two are separated 5 minutes later, the drunk has inflicted more than 20 bites on its cowering victim. Miczek is studying what neurochemical factors set the aggressive mouse apart. To do so he can plug a set of tubes into its brain, to decant minute fluid samples during a confrontation.

    Miczek thinks studies like this are highly relevant to a society in which alcohol figures in two out of every three violent crimes. Yet he and other researchers who study the biology of aggression find themselves struggling to make headway. Part of the problem is lukewarm support for their work. In 1992, a proposed conference on the genetics of aggression exploded into a major political row after Frederick Goodwin, then director of the Alcohol, Drug Abuse, and Mental Health Administration, likened inner city violence to a “jungle”; since then, the National Institutes of Health has quietly de-emphasized studies on the biology of violence (see p. 570). “It used to be a thriving area of research,” says Miczek. “But the whole field has been dismissed as irrelevant because of those remarks.”

    Further undermining their efforts are attacks from animal rights activists. Opposition from activists dried up funding for what had been a flourishing violence research school in the United Kingdom in the 1980s (see p. 572). As a result, the research community is tiny, amounting to a few hundred researchers worldwide, and it's hard to entice good young scientists, says Miczek. “Why keep banging your head against a brick wall, when there are much easier areas of research?” adds Adrian Raine, a neuropsychologist at the University of Southern California (USC) in Los Angeles.

    Despite these handicaps, the field has generated some interesting findings and hypotheses about how hormones, genes, and the brain control aggressive behavior. Researchers refer to these sets of discoveries as “stories”: There's the serotonin story, the Y chromosome story, the hypothalamus story, and many others. But it's still hard to see how these single, and sometimes controversial, pieces of the puzzle fit together, says behavioral geneticist Stephen Maxson of the University of Connecticut, Storrs, and progress is slow. And some are wary that if the field matures and starts producing new, specific antiaggression drugs, they could be abused by governments or doctors as a “quick fix” for violence in society, rather than addressing the social and economic problems that often underlie it.

    Tainted past

    That difficult position today stems in part from the field's checkered past. In the late 18th century, Viennese anatomist Franz Joseph Gall developed a theory called phrenology, which held that most human traits—including antisocial behavior—were regulated by specific brain regions. The larger a region's size—which could be determined from bumps on the skull overlying that area—the better developed the corresponding faculty. His theory was later discredited, as, almost a century later, was a doctrine of the Italian criminal anthropologist Cesare Lombroso, who claimed that certain body features he called “stigmata”—such as a sloping forehead or asymmetrical facial bones—could give away criminal types. These and other notions brought the field into such disrepute that, even today, the mere whisper that a person's constitution might predispose him or her to violence or crime is enough to raise hackles. In the first half of the 20th century, the widespread use of lobotomy (a procedure in which the frontal or temporal lobe is cleaved from the rest of the brain) to make people less aggressive and impulsive cast a cloud over clinical research on aggression as well, even though the procedure won its inventor, Portuguese neurosurgeon António Egas Moniz, the Nobel Prize in medicine in 1949.

    Baring its teeth.

    A rhesus macaque vents its rage.

    CREDIT: DEE HIGLEY

    In part because of this tainted legacy, modern-day violence researchers take pains to emphasize that one's genetic heritage isn't everything—rather, behavior is shaped by a subtle interplay among genes, environmental conditions, and life experiences. “That's the beauty of this work,” says Dee Higley, who studies aggression in rhesus macaques at the National Institute on Alcohol Abuse and Alcoholism in Poolesville, Maryland. “It shows you that genes matter, but they're not destiny.”

    Teasing apart the contributions of nature versus nurture, however, is a daunting task. One problem is finding animal models that mimic human violence yet are ethically acceptable. Popular techniques in the '60s and '70s included giving rats electrical shocks or applying heat to their footpads to elicit aggressive behavior—a model that many researchers now concede was cruel and, because it was so unnatural, hardly meaningful. “That was nutty stuff,” says Craig Ferris of the University of Massachusetts (UMass), Worcester. Instead, most researchers now use so-called ethological models in which they draw out more natural forms of aggression displayed by animals defending territory or establishing a dominance hierarchy. Ferris, for instance, pairs up male Syrian golden hamsters and monitors their aggressive confrontations. In monkey models, researchers are limited to studying “display behavior” where animals snarl and growl, but very rarely actually hurt each other (see sidebar below); eliciting more injurious forms of aggression is considered by most to be unethical. Although such dustups may seem only a distant echo of, say, a wife-beater or a bloodthirsty psychopath, researchers think some of the same neural pathways may be involved.

    Clinical researchers face formidable hurdles, too. For one, many studies involve unpleasant or time-consuming experimental procedures, such as spinal taps or brain scans. In addition, it's notoriously difficult to quantify a person's aggressiveness. “You could have a confederate frustrate a subject and see how they respond,” says University of Chicago psychiatrist Emile Coccaro. “But that's kind of dangerous. What if the guy hurts your confederate?” Instead, researchers query subjects on violent acts they committed in the past or seat them behind a computer and make them believe they're playing a game against an invisible opponent whom they can jolt with mild electrical shocks. The higher they are willing to crank up the voltage, the more aggressive they are presumed to be.

    The complex serotonin story

    Using such approaches, many research teams are focusing on neurotransmitters, the chemicals that ferry messages between neighboring brain cells. Without doubt the hottest one around is serotonin, which besides aggression has been blamed for a panoply of problems including depression and eating disorders. Numerous studies have found that aggressive animals, including humans, on average have lower levels of a serotonin metabolite—which is thought to reflect lower serotonin levels in the brain—in their cerebrospinal fluid (CSF). To demonstrate a causal relationship, researchers have given animals drugs that lower serotonin levels, which sometimes makes them more impulsive and aggressive, or increase levels, which has the opposite effect. “It's stronger than most stories in neuroscience,” says Coccaro.

    Murderous mind?

    A PET scan reveals less activity in the prefrontal cortex of a convicted killer (right) than in that of a nonviolent control.

    CREDIT: ADRIAN RAINE

    Even so, the serotonin story is complicated, says Miczek. Researchers know there are at least 14 receptors for serotonin in the brain, and they're currently studying what each of them does. Miczek, for instance, is working on a receptor subtype called 1B. Activating that receptor with a drug seems to quell aggression in mice, rats, and monkeys, he says, and the receptor seems to be an interesting target for new drugs to treat violent behavior in humans. But strangely, the drug lowers serotonin levels in the brain rather than increasing them, presumably because 1B is a so-called presynaptic receptor, which decreases the amount of serotonin a brain cell churns out. That means although serotonin is involved in aggression, its role is far from clear-cut—a fact most researchers overlook. “Complexity never sells,” says Miczek. “A simple line is easier.”

    Nor do researchers know how many other neurotransmitters may be involved. Ferris, for instance, has found that aggression in his golden hamsters rises with increasing levels of another messenger, vasopressin, in a part of the hypothalamus. A study of 26 people with antisocial personality disorder that Ferris published with Coccaro in 1998 also found that higher levels of a vasopressin metabolite in their CSF correlated with a more aggressive past.

    Findings have led to similar confusion on the genetic front. Twin and adoption studies strongly suggest that violent behavior does have at least some genetic component; one dramatic example is a 1993 study of a Dutch family, some of whose male members engaged in all kinds of violence, from arson to attempted rape. Researchers traced their rages back to a very rare defect in the gene encoding an enzyme called MAOA, which breaks down neurotransmitters.

    In mice, researchers have pinpointed at least 15 genes (including the mouse version of MAOA) that seem to either heighten or reduce aggression. Some of their findings were published in high-profile journals and were hailed by the press as “aggression genes”—yet most researchers warn against making too much of them. Several of the genes were identified after they had been knocked out in a popular lab mouse strain called 129, says Miczek. But that strain is known to be unusually pacific, he says, so any increase in aggression looks quite dramatic. However, the knockouts are sweethearts compared to some other lab strains or wild mice, he says, which raises the question of how relevant these genes are. Moreover, knockout studies can be misleading, warns Maxson; if, for example, a researcher disrupted a mouse's sense of smell, they'd also wreck its ability to communicate through pheromones, making it prone to end up in fights. That, however, wouldn't mean the researcher had found an “aggression gene.”

    Yet the mouse knockouts tantalize some researchers studying humans. In 1995, Randy Nelson of Johns Hopkins University in Baltimore reported in Nature that mice lacking the gene for nitric oxide synthase (NOS), an enzyme that produces the gaseous neurotransmitter NO, are more aggressive than wild-type mice. Recently, he says, he was approached by researchers who want to check whether violent prison inmates have low levels of NOS, too. “It's an easy experiment,” says Nelson. But he says he's leery of doing the study, because he's not sure how NOS deficiency causes aggression and whether this happens in nature, too. Besides, the transgenic mice attack other mice from the back, says Nelson, reminiscent of how they catch a grasshopper; he wonders whether their aggression is a predatory behavior gone awry, which may have no human equivalent at all. “It's a very charged kind of thing,” says Nelson. “You want to have a decent intellectual fabric behind it before you go to humans.”

    The prefrontal cortex story

    Some researchers, meanwhile, wonder whether further animal studies are useful at all to understand human violence. “Animals don't rob banks, they don't rape people. They don't create the problems in society,” says USC's Raine. “What's really holding us back is research in humans.” Raine, who moved from Britain to Southern California in 1987—because, as he once wrote, “in addition to the good weather, there were plenty of murderers”—has another reason for sticking to humans: He believes violence is regulated in part by the prefrontal cortex, a brain area that's large in humans, but small in monkeys and minuscule in rodents. Researchers have suspected that this area plays an important role in controlling impulsive and violent behavior ever since the famous 19th century case of Phineas Gage, who changed from a kind, thoughtful man into a reckless brute after a horrible accident with an iron bar destroyed a large part of his prefrontal cortex.

    In a 1997 positron emission tomography (PET) scan study of 41 incarcerated murderers and 41 controls, Raine found that glucose metabolism in the prefrontal cortex was diminished in the murderers—suggesting that the area didn't work properly. A few months ago, he published a study in the Archives of General Psychiatry showing that in people with so-called antisocial personality disorder, of whom many had committed violent acts, the amount of gray matter was 11% less than in controls. But Raine has no idea how his results fit in with, say, the serotonin data. “We don't know,” he says, “because nobody has looked at low serotonin and frontal functioning in the same study.”

    Coccaro, too, has found evidence that violent and impulsive people are not as good at several tasks thought to involve the frontal cortex. For instance, when given the option of choosing from different card decks, some of which promise higher returns but in the end cause losses, they keep picking from the dangerous decks, whereas normal people learn to avoid them. They also tend to fail to recognize certain facial expressions, such as anger or disgust. Coccaro now wants to have them perform the same tasks while observing their brain activity using functional magnetic resonance imaging. “The hypothesis would be that the activity is less in the orbitofrontal areas,” he says. “But of course you never know what's going to light up until you put them in the scanner.”

    The promise and perils of treatment

    The ultimate goal of much of this work—besides satisfying intellectual curiosity—is to develop treatments for violent people. For instance, if the prefrontal cortex is at fault, says Raine, one future way to intervene may be to implant chips that somehow make up for its reduced function. Already, he notes, some biomedical engineers have predicted that the first electronic brain implants will become a reality within the next decade. “Forty years ago, we were chopping off the frontal cortex in violent people. In 50 years' time, we'll be doing the opposite,” Raine says. “We'll be doing reparative surgery.” But others dismiss the idea out of hand. “At this point, I would call that science fiction,” says Miczek. “We're dealing with very high-level processing, with circuits involved in making moral judgments. It's not a machine.”

    No trespassing.

    A male Syrian golden hamster attacks an intruder.

    CREDIT: CRAIG FERRIS

    Instead, most researchers are thinking of specific drugs to treat aggression. Right now, such therapies don't exist; violent mental patients are usually treated with high doses of antipsychotics that act on a neurotransmitter called dopamine. Those drugs quell aggression but also cause what Ferris of UMass calls the “potted-plant syndrome”—they are heavily sedative and make patients lose interest in life. Ideally, a drug would inhibit aggression only, leaving other mental processes intact. Building on the serotonin evidence, several researchers have tried using so-called selective serotonin reuptake inhibitors, such as Prozac, which block serotonin's removal from the synapse. And although they did reduce aggression in human patients, they're still not as selective as researchers would like them to be. The compounds acting on the 1B receptor are promising new candidates, says Miczek—although they too may turn out to have side effects.

    But even if such selective aggression drugs were developed, there are difficult regulatory and ethical issues looming on the horizon. At the moment, aggressive or violent behavior is not a separate entry in the Diagnostic and Statistical Manual IV, the bible of psychiatry; rather, it's considered a symptom that can occur in several mental illnesses. As a rule, the U.S. Food and Drug Administration (FDA) approves drugs only for clearly specified disorders, not symptoms. There are exceptions, such as pain or fever, but those are well defined, says Thomas Laughren, team leader at FDA's Psychiatric Drug Products Group. How aggression should be defined is a matter of debate. “We need some kind of consensus in the clinical and academic community that it's a real thing,” says Laughren, “and they have to define a reasonably homogeneous group of patients.”

    Indeed, this obstacle helped kill the only commercial program so far aimed at developing a class of aggression-reducing drugs, or “serenics.” In the 1970s, a Dutch company called Duphar (now part of Solvay) started a program to develop a drug that would be used to treat aggression in various mentally ill patients, such as manic-depressives, schizophrenics, and Alzheimer's patients. After screening thousands of compounds in animals, a team led by Berend Olivier identified one promising candidate called eltoprazine that seemed to have few side effects. (They later discovered it mimicked serotonin's action on two receptors.)

    In a trial among retarded people, eltoprazine reduced aggression better than a placebo. “It was really quite a nice compound,” says Olivier, now at a Connecticut company called Psychogenics, “and it would have been a good basis for further research.” All along, the company was aware that the drug would be politically sensitive—Olivier even received PR training to deal with the media—but when it became clear that the FDA would not allow it onto the U.S. market, Solvay halted the entire program in the early '90s, says Olivier.

    Another question is who would be treated with such drugs. Unlike most diseases, it's usually not the perpetrator who defines aggression as a problem, notes Laughren; it's the environment. Violent people may feel they are functioning normally, and some may even enjoy their occasional outbursts and resist treatment. Thus, the question would become who decides who needs the new drugs. “We would have to have a serious ethical debate about that,” says Laughren.

    Most researchers emphasize that they're not advocating drugging everybody who has ever committed a violent crime—or is deemed prone to do so. The most obvious patients, they say, are extremely violent people who are currently locked up in psychiatric wards and heavily sedated, or incarcerated in solitary confinement. But all agree that there will be a gray zone, and there may be a push to treat more and more people. Indeed, some have warned that violent incidents might be latched onto as a pretext for using drugs to pacify minorities. “The problem is that the public is going to say, ‘Why don't you give this to the disenfranchised part of society that's more violent because of their environment?’” says Ferris. “But we're not here to solve our social problems. That's a quagmire.”

  20. The Snarls and Sneers That Keep Violence at Bay

    1. Elizabeth Pennisi

    A bull elephant gores a rival in a duel over a female. A rat attacks and eats his mate's pups. In a fight for supremacy, one chimpanzee mauls another to death. Scenes like these may suggest that the ability to mete out violence is linked to survival in the animal kingdom. But a handful of researchers is now making a persuasive case that scores are settled far more often by subtle, nonviolent signals—a curled lip, a snarl, a swivel toward an opponent. Their provocative idea is that inflicting violence on a member of one's own species is a pathological condition that arises when these signals are missed or misinterpreted.

    Scientists are now unearthing clues to the behaviors that keep violent impulses at bay. Their findings suggest that personality, social status, life experience, and anxiety levels all factor into whether chance encounters end in peace or, more rarely, in carnage. “For many animals, a lot of social behavior is probing or testing a relationship, always testing with a little aggression and a little friendliness,” says Sergio Pellis, an ethologist at the University of Lethbridge in Alberta, Canada.

    A bully in the cage.

    A male rat strikes an offensive sideways position while two others rear up to defend themselves.

    CREDIT: R. J. BLANCHARD, CHRIS M. MARKAM, AND SEAN L. GALMON

    “Humans also have a tendency to restore relationships and resolve conflict,” says Frans de Waal, a primatologist at the Yerkes Regional Primate Research Center in Atlanta, Georgia. Laying bare the roots of violence in people, says Jaap Koolhaas, a behavioral physiologist at the University of Groningen in the Netherlands, may lie in understanding how we—like many other animals—rely on instinctual behaviors that keep violence in check.

    Coping strategies

    In the 1960s, the great Austrian ethologist Konrad Lorenz argued, extrapolating from his studies of birds and fish, that humans are an aggressive species and that a penchant for violence leads to clan and tribal warfare. But that dogma is beginning to fade, largely thanks to studies suggesting that aggressiveness is part of a repertoire of behaviors that has arisen to balance an individual's need to look out for itself against the need to maintain good standing within the group.

    By the time Koolhaas delved into this area about a decade ago, researchers had shown that in many species, individuals are mainly polarized into aggressive or passive personality types. “There's a bimodal distribution,” he notes. Searching for cognitive differences that set these two personality profiles apart, Koolhaas's team found in 1993 that highly aggressive rodents are more likely than calmer peers to act quickly when thrust into a potentially life-threatening situation. He proposed the existence of an inborn tendency toward either a “proactive” or “reactive” coping strategy, which has since been discerned in organisms as diverse as birds, cattle, pigs, fish, and octopuses. A typically reactive rat, for example, will sniff an electric probe introduced into its cage and get zapped just once, thereafter steering clear of the pain-giving intruder. A proactive rat, on the other hand, will actively try to bury the probe under the bedding of its cage. That type of reaction may seem wasteful, but proactive coping strategies can boost the odds of survival: Put in a chilly cage, proactive rats build nests to stay warm, while reactive rats sit in a corner and shiver.

    Proactive animals also tend to be less easily deflected once they learn a behavior, Koolhaas and his colleagues have found. “The highly aggressive animal tends to develop routines,” he says. “With repeated experience, these animals don't pay any attention to environmental stimuli.” At the XIV World Meeting of the International Society for Research on Aggression in Valencia, Spain, earlier this month, Koolhaas detailed experiments on rodents in mazes and in social situations. His group has found that proactive mice and rats that have learned to navigate a maze to get to a food reward are unperturbed when the maze is slightly altered—such as when tape is placed at a spot on the floor. They keep going as if the tape weren't there. Reactive individuals, on the other hand, stop and check out the new landmark. Then, as if confused by the tape, they often lose their way to the food, even though the course is otherwise the same.

    This tendency shows up in social interactions, too. Even rodents that have never seen a member of the opposite sex can quickly tell a male from a female. Males tend to be nasty toward other males when first introduced, whereas they usually nuzzle new females. But if a solitary, proactive male is presented with another male several times in a row, then is introduced to a female, he will fail to notice the gender switch and will attack the female. “It first does and then thinks,” says Koolhaas, whereas a reactive individual “first thinks and then does. It's a crucial difference, and it leads ultimately to violence.”

    Once a highly proactive animal for the first time wins a series of aggressive encounters and repels an intruder, it acquires a fighting habit, and “its behavior does not depend anymore on what the opponent is doing,” Koolhaas says. It ignores peace overtures, such as baring the belly. But such highly proactive animals are rare: Most aggressive encounters between a solitary rodent and a newcomer end up with the guest showing the proper deference followed by détente or even playfulness. In observing the highly proactive animal's failure to take into account what the opponent is doing, Koolhaas says, “we have hit upon one of the causes of the development of violence.”

    Personality and rank

    “Type A” behavior is not limited to rodents, says Robert Sapolsky, a neuroendocrinologist at Stanford University in California. He has observed several troops of baboons in their native habitats for almost 20 years. “Personality can be at least as important as social rank” in determining how well an individual fits in, and consequently, the likelihood of getting into fights, he says. Like Type A humans, who tend to perceive a hostile world around them, Type A baboons have higher levels of stress hormones. And if a male Type A baboon can't tell the difference between a minor provocation and a major power struggle—inferring a threat, for instance, from a subordinate that happens to be napping too close to his turf—he's “less likely to remain dominant,” Sapolsky says.

    Although baboons are considered quite aggressive primates by nature, violence tends to be much more prevalent when the troop's structure is unstable, such as when baboons are kept in captivity. If the troop members don't know one another, or if the hierarchy is disturbed, say, by the loss of the top male, the rush to establish rank results in scuffles that only subside when everyone knows his or her place. “Aggression has something to do with attaining a high rank, but far less to do with retaining it,” Sapolsky says. Once dominance is established, violent actions subside, replaced by subtle signals—a sideways glance, for example, or a slight but perceptible tensing of the body.

    Should a squabble occur, animals know how to make amends. In a new book, Natural Conflict Resolution, de Waal and Filippo Aureli chronicle peacemaking in animals and in people: After an aggressive encounter over food, for example, animal combatants often offer each other friendly overtures, such as grooming or licking one another. Children, too, kiss and make up. Such conciliatory gestures are “extremely widespread,” de Waal reports, as they help preserve relationships that may be necessary to survival and can be particularly key when animals (or people) live in crowded conditions. (See de Waal's Review on p. 586.)

    These gestures acquire meaning, it seems, during critical developmental windows. By play fighting, young rats, for example, learn how to interact with one another. Both proactive and reactive coping strategies are moderated by these experiences. As Koolhaas's team reported in the March 1999 issue of Developmental Psychobiology, depriving adolescent rats of just 2 weeks of contact—and, consequently, play fighting—transformed them into maladapted individuals who had serious problems dealing with their peers. Isolation, he argues, prevents an individual from “learning to play the game according to the rules.”

    Those results are in line with what Pellis has found in 20 years of observing play fighting among rats. His team has gleaned the subtle changes in posture and movements that distinguish “play” from something more serious. A rat lacking play experiences is likely to be overly friendly when he first encounters another animal, sniffing it with gusto. “But when the other animal reciprocates, it exhibits hyperdefensiveness,” backing off or trying to nip the face, Pellis says. Such behavior seems odd to a rat that has had a normal rough-and-tumble youth, and the miscommunication can result in escalated aggression as both animals get excited and drawn into a fight. This, says Pellis, suggests that “social skills are especially sensitive to what happens in juveniles.”

    Studies on nonhuman primates support the idea that play fighting is not so much a way to learn combat skills as a way to develop social intuition. According to Pellis, adults in species with irregular contact are more playful than adults of species that live in close-knit groups. Play fighting involves a lot of sniffing and touching, and animals that spend much time together have substituted more sophisticated signals that are often visual or vocal.

    Even so, Pellis says, grooming among nonhuman primates suggests that touching still plays an important role in defusing tensions. For example, when primates are fed on a regular schedule, they develop a habit of grooming one another before the food arrives, possibly to ease tensions that might escalate over access to the upcoming meal.

    The thought of schoolchildren combing each other's hair to calm down before lunch may seem odd. But animal studies are relevant to people, says Pellis. Children have a biological need to play, he points out, and perhaps even to get into minor scuffles with their peers. That may be how they learn subtle social cues. “Preventing kids from doing it may be causing them harm,” he says. “They may be less able to deal with subtle interactions.”

    Neither Pellis nor others studying aggression in animals think their work will fully explain why people on occasion turn violent. Nevertheless, “the distance between us and animals may be smaller than we like,” says Menno Kruk, a behavioral neurobiologist at Leiden University in the Netherlands. “Actually too close for comfort sometimes.”

  21. The Violence of the Lambs

    1. Constance Holden

    Researchers view violence as a complex interplay of social and biological risk factors, some of which may show up in infancy

    Signs that “Steve” was headed for big trouble were there from very early on if anyone had cared to read them. Born to an alcoholic teen mother who raised him with an abusive alcoholic stepfather, Steve was hyperactive, irritable, and disobedient as a toddler, according to his mother.

    Steve, first interviewed by researchers at the age of 13, is a member of a university-run longitudinal study investigating how delinquent behavior develops in boys. After dropping out of school at age 14, Steve spent his teen years fighting, stealing, taking drugs, and beating up girlfriends. His mother, according to the researcher who interviewed her, “did not seem to be aware of the son's behavioral problems.” School counseling, a probation officer, and meetings with child protective services failed to forestall disaster: At 19, several weeks after his last interview with researchers, Steve visited a girlfriend who had recently dumped him, found her with another man, and shot him to death. The same day he tried to kill himself. Now he's serving a life sentence without parole.

    Getting an early start.

    Violence comes naturally to babies, but after age 2 most children learn how to express themselves or settle disputes in other ways.

    CREDIT: R. T. TREMBLAY ET AL.

    Scientists are now refining their views about what sets people like Steve on a path toward violence. Until a decade or so ago, most social scientists thought violent individuals were almost invariably the products of an abusive environment. Recent findings, however, reveal that violent tendencies often show up in infancy, suggesting prenatal roots. Researchers are increasingly coming to view violence as the end result of multiple risk factors that may include a biological vulnerability—either genetic or created in the prenatal environment—that can be brought out or reinforced by the social environment.

    In Steve's case, he had just about every risk factor in the book. His hyperactive behavior also contributed to his inability to concentrate at school; he quickly fell behind, becoming alienated from peers who might have given him some of the social structure he lacked at home. He became increasingly emotionally disturbed—anxious, angry, confused, prone to fantasies, and self-destructive. His deviant behaviors and emotions were reinforced by hanging around with others who engaged in the same behaviors.

    Chronically violent children such as Steve are rare: Experts estimate that they account for fewer than 2% of boys and far fewer girls. And most violent children do not become violent adults. But “in the majority of the most seriously violent cases, the behavioral problems go back to early childhood years,” says psychologist Rolf Loeber of the University of Pittsburgh.

    Recipe for violence

    Scientists have long known that damage to certain brain regions can result in violent behavior. The classic case is Phineas Gage, a railroad worker who survived a horrific accident in 1848 when a charge he was setting exploded prematurely, sending an iron tamping bar through his forehead. Although it destroyed a large chunk of his frontal lobe, the accident left Gage's motor and cognitive functioning unimpaired. However, it transformed him from a polite, model employee into a violent, irresponsible man (Science, 20 May 1994, p. 1102).

    Most violent individuals have no obvious brain damage. But twin and adoption studies over the past 2 decades have shown that genes influence many traits associated with chronically violent behavior. Impulsivity is one. Another is oppositional temperament—angry, vindictive, resistant to control, deliberately annoying, and blaming others. Callousness, or lack of empathy, is a trait that often shows up in very young children as cruelty to animals. The studies suggest that about half the variation in these propensities can be chalked up to genes, says psychologist David Rowe of the University of Arizona, Tucson. Attention deficit hyperactivity disorder (ADHD) exacerbates these tendencies. Low IQ is another important risk factor.

    Most violent youths do not become murderers like Steve, but they often share elements of his story. “One thing seems to hold up no matter what” throughout various longitudinal studies, says Harvard psychiatrist Enrico Mezzacappa: “Aggression is a fairly stable phenomenon.” Children who become chronically violent adults generally are difficult from early childhood. And without early intervention on multiple fronts, their antisocial tendencies are likely to be reinforced in a downward spiral of confusion, alienation, and rage. But just which early risk factors are most powerful, and how they interact, is proving very tough to sort out.

    One ongoing study that focuses on social and environmental risk factors is a project run by Alan Sroufe's group at the University of Minnesota, Twin Cities, exploring the development of violent delinquent behavior. His group has been following 185 “high-risk” impoverished children for 23 years, starting from before birth when mothers were patients at a public health clinic. His colleague Byron Egeland says their research points to bad parenting as the single most important risk factor for violence, and that “[innate] temperament doesn't play that big a role.” Hyperactive, violent children, he says, “are out of control because they never developed emotional [control] and self-regulation.” Usually, he says, that's because they lack “a good relationship with a caregiver” and, later on, are influenced by deviant peers. Those who had formed good attachments with their mothers (as observed in lab situations) were significantly less likely to get arrested for assault before age 17.

    Support for this connection comes from psychologist Richard Tremblay of the University of Montreal, whose group has followed 1000 Montreal boys since 1984 when they were 6, at which time about a third were identified as physically aggressive. Most of the boys had calmed down by age 12, but 4% continued to be chronically violent into their teens. Tremblay and psychologist Daniel Nagin of Carnegie Mellon University in Pittsburgh found two risk factors in this group: Their mothers were generally less educated, and they had delivered their sons at an earlier age. Tremblay thinks this is indicative of mothers who “lack the capacity to effectively socialize a difficult child.” Such mothers are often depressed and easily feel overwhelmed by their new responsibilities.

    But unlike Sroufe, Tremblay says such long-term studies cannot reveal which factors originally set in motion destructive behavior patterns. A child may be violent and irresponsible because his mother or closest caregiver treated him badly, or because he inherited characteristics such as impulsivity and low IQ. In either case, the child may be evoking angry and frustrated reactions from his parents—responses that only drive him further astray.

    In cold blood

    One researcher who is using a psychophysiological approach to try to tease out some of the basic ingredients for violent behavior is Adrian Raine of the University of Southern California in Los Angeles, whose group is following nearly 1800 children in Mauritius. In 1997, the team reported that children with low heart rates at age 3 were more likely to be physically aggressive at age 11. In another study, boys who had had lower skin conductance (a measure of sweating) and slower brain waves were more likely than others to be later arrested for felonies.

    Raine suspects that what he's seeing is a biological “marker” for a propensity for violence: low autonomic arousal or a sluggish nervous system. This, he says, makes people less anxious and inhibited and raises their threshold for arousal—which, he speculates, means that they have to go to great lengths to feel stimulated. If a person is impulsive and callous as well, it's a poisonous combination. University of Pittsburgh psychologist Dan Shaw cites murderer Gary Gilmore as an extreme manifestation of low arousability: a classic psychopath, Gilmore would get his kicks in prison by sticking his finger in a live electrical socket.

    Raine's findings are controversial, and Loeber says they are difficult to interpret, because there has been “no integration” of such findings with factors such as upbringing, school performance, and influence by peers. Nonetheless, David Farrington of the Institute of Criminology in Cambridge, England, believes that in both cold-blooded predators such as Gilmore and more hot-blooded “reactive” types such as Steve, there is some kind of dysregulation in the central nervous system.

    Mezzacappa agrees. In a vulnerable individual, early abuse and neglect can alter the stress response—thus activating “the predisposing biological substrates for being emotionally reactive [and impulsive],” he says. If the stress is severe enough, circuits get overloaded and shut down—which may lead to Gilmore-style “emotional blunting,” Mezzacappa believes.

    Reaching out

    The implications of these studies for preventing the young Steves of the world from growing into violent adults are principles parents and teachers have always known: People need both consistent love and consistent discipline to grow up as well-socialized human beings. But just how that happens is still not clear, says Tremblay. He points out that if you forget about the consequences of aggression and just look at the behavior itself, the most violent age of all is 2. “Babies do not kill each other, because we do not give them access to knives and guns,” he says. “The question … we've been trying to answer for the past 30 years is how do children learn to aggress. [But] that's the wrong question. The right question is how do they learn not to aggress.”

    There are some, such as Joshua Andrews, who need special help in learning not to aggress. By the time Joshua had reached the age of 2, says his mother, Susan Andrews, he would bolt out of the house and into traffic. He kicked and head-butted relatives and friends. He poked the family hamster with a pencil and tried to strangle it. He threw regular temper tantrums and would stage toy-throwing frenzies. “At one point he was hurting himself—banging his head against a wall, pinching himself,” not to mention leaping off the refrigerator, says Andrews. Showering Joshua with love, she says, made little difference: By age 3, his behavior got him kicked out of his preschool. Later that year, he got a triple diagnosis—ADHD, impulse control disorder, and oppositional disorder. After several months of failed therapy, Andrews found a psychiatrist, Mezzacappa, who was able to throw Joshua a lifeline—a combination of medication and psychotherapy—that might just have rescued him from really hurting himself or someone else.

    Mezzacappa put the boy on two antidepressants, Wellbutrin and Trazodone, which operate on different brain chemicals known to be related to attention, impulsivity, and violence. The medication is combined with a rigorous regimen of therapy, a special program at school, and behavior modification at home. “I'm very strict,” says Andrews. Although she gives him plenty of affection, “the softer, kinder approach didn't work with him.” She makes it very clear what the consequences of hitting or other bad behavior will be—such as “time out” or a period of toy or TV deprivation. Now, thanks to medication, he will sit still long enough to let someone read to him, and he's learned counting and the alphabet. “There's no doubt in my mind that without a proactive mother and extensive therapy Joshua would turn into a violent delinquent,” says Mezzacappa. But now, says Andrews, “I have no doubt he'll grow up to be a normal, healthy little boy.” Joshua is one of the lucky ones.

  22. Has America's Tide of Violence Receded for Good?

    1. Laura Helmuth

    Experts in the young field of violence epidemiology blame guns and crack cocaine for America's deadly crime surge in the early 1990s. Explaining the subsequent decline in violent crime rates has been harder

    Violence in America is often discussed at the intellectual level of call-in radio. Anyone with access to a soapbox or an Internet site can pontificate about the virtues of clean living or a well-armed populace. But recently, out of this opinionated chatter a few empirical signals have begun to emerge.

    Scientists are attempting to apply well-honed analytical tools—methods developed for studying economics, epidemiology, and sociology—to make some sense of senseless violence. They are examining the effects on public life of such environmental trends as tougher gun laws, more sophisticated local policing strategies, huge increases in the rate of imprisonment, and the destructive crack epidemic that tore through U.S. cities a decade ago.

    Two dramatic swings in violent crime over the past 15 years provided most of the fodder for this research: Starting in about 1985, crime rates shot up, peaking in the early 1990s. A cooling-off period followed, and it continues to this day.

    Chart: Slaughter by the innocents. While adult homicide rates dropped in the early '90s, young people went on a murder binge. (Source: Sourcebook of Criminal Justice Statistics)

    Like floodwaters receding, the number of violent crimes in the United States has ebbed for the past 8 years. In the FBI's preliminary report of crime for 1999, murder was down 8% compared with 1998; rape decreased by 7%, assault by 7%, and robbery by 8%. However, these drops were less dramatic for cities with more than 500,000 people—worrying violence researchers, who know that big cities set the trend for the rest of the country. Experts say the need to understand crime trends is now even more urgent.

    Violent crime rates have been on a roller coaster for years. They peaked in the early 1960s, 1970s, 1980s, and 1990s. Are we due for another lurch up? Some of the factors that seem to have helped squelch crime could be temporary, such as low unemployment rates. But others, including a growing intolerance for violence as a means of settling interpersonal disputes, seem to have become cultural norms.

    Nearly all the research literature on the epidemiology of violence is less than a decade old, says Garen Wintemute of the University of California, Davis, who studies the effects of gun laws. He thinks violent crime could continue to fall. “The difference this time,” compared with other recent dips in violent crime rates, he says, “is that as rates have fallen, a great deal of effort has been expended on figuring out why.” And understanding why, in some cases, can help shape policies that thwart violence.

    Babes with guns

    The most recent U.S. murder binge began in the mid-1980s in big cities such as Los Angeles, New York, and Washington, D.C., before spreading quickly to the rest of the country. It was a phenomenon of kids, says criminologist Alfred Blumstein of Carnegie Mellon University in Pittsburgh, director of the National Consortium on Violence Research—kids with guns. While adults have been committing fewer murders per capita for decades, homicides by people younger than 20 more than doubled between 1985 and 1991 (see chart). In many countries, teenagers are more violent than people in other age groups, says Frank Zimring, a law professor at the University of California, Berkeley. But within this age group, U.S. teens are far more violent than their foreign peers. For example, U.S. 17-year-olds committed seven times more violent crimes during the mid-1990s than people of the same age in the United Kingdom.

    Fueling this epidemic of violence, experts agree, was crack cocaine. The new drug was cheap enough to addict new users and profitable enough to draw a torrent of new dealers, says economist Jeff Grogger of the University of California, Los Angeles—and that set off turf wars. Disputes among rival dealers could not be settled civilly in court. Violence, Grogger says, is one way to “enforce property rights in the absence of legal recourse.”

    To estimate crack's impact on murder rates, Grogger gathered data on the introduction of crack into inner cities and subsequent homicide rates. He surveyed police chiefs in 27 cities to find out when they first noticed the emergence of the crack market and also used National Institute on Drug Abuse statistics on emergency-room admissions to spot increases in crack-related overdoses. As Grogger had suspected, crack hit the big cities first, arriving on the streets of Los Angeles around 1984, hopping to San Diego in 1985, then spreading north and east to San Francisco, Tampa, and New York. Milwaukee was the last city in the survey blighted by crack, in 1991. In some cities, murder rates started climbing the year after crack arrived. Grogger estimates that the violent crime peak in 1991 would have been about 10% lower nationwide—or 5% below the 1981 peak—had crack never appeared.
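    The logic of this event-timing approach can be sketched with a toy calculation: line up each city's homicide rates relative to the year crack arrived and compare the averages before and after. The numbers and city names below are invented for illustration; they are not Grogger's data, and his actual analysis relied on a far more careful statistical model.

```python
# Toy illustration of an event-timing comparison (hypothetical data only;
# not Grogger's data or his statistical model).

crack_arrival = {"City A": 1984, "City B": 1985, "City C": 1987}  # hypothetical arrival years

homicide_rate = {  # hypothetical homicides per 100,000 residents
    "City A": {1982: 20, 1983: 21, 1984: 22, 1985: 26, 1986: 29},
    "City B": {1983: 15, 1984: 15, 1985: 16, 1986: 19, 1987: 21},
    "City C": {1985: 10, 1986: 10, 1987: 11, 1988: 13, 1989: 14},
}

before, after = [], []
for city, rates in homicide_rate.items():
    arrival = crack_arrival[city]
    for year, rate in rates.items():
        # Group each observation by whether it precedes crack's arrival in that city.
        (before if year < arrival else after).append(rate)

mean_before = sum(before) / len(before)
mean_after = sum(after) / len(after)
print(f"Mean rate before arrival: {mean_before:.1f}")
print(f"Mean rate after arrival:  {mean_after:.1f}")
print(f"Implied increase: {100 * (mean_after / mean_before - 1):.0f}%")
```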

    In one unintended consequence of the war on drugs, cracking down on crack markets may have actually increased murder rates, says Blumstein. As older, established dealers were locked up, the demand for crack drew younger and younger people, mostly African Americans from inner cities, into the business—although he cautions that it's still an “open question” how many young dealers were drawn to the market because of increased demand for crack compared with how many replaced imprisoned older dealers. The problem, Blumstein explains, is that “teenage males are not the world's best dispute resolvers—they've always fought.” And the presence of guns makes typical confrontations deadly.

    Chart: Throwing away the key. A quadrupled incarceration rate might be helping keep America a safer place to live, for now. (Source: Sourcebook of Criminal Justice Statistics)

    Homicide rates rise in such situations because a high concentration of guns in a neighborhood creates an “ecology of danger,” says criminologist Jeffrey Fagan of Columbia University. He interviewed 400 young men in New York's most dangerous neighborhoods and found that violence seemed to be contagious: Gun use spread through neighborhoods from 1985 to 1995 like a communicable disease. Young people who otherwise wouldn't carry guns felt that they had to in order to avoid being victimized by their armed peers, they reported. The guns skewed otherwise normal adolescent posturing, Fagan says, and tinged every social interaction with the threat of death. “Ultimately, guns are a form of social toxin,” he says.

    Shootings accounted for most of the slayings during the recent murder binge, and for all of the increase in homicides. A domestic arms race between police and criminals led to a proliferation of higher caliber, faster-firing guns. And the lucrative crack market meant that “the bad guys could afford expensive, high-capacity weapons,” says Wintemute. As a result, he says, shooting victims came into emergency rooms with more entry wounds and were more likely to die from their injuries.

    The grisly shooting spree prompted former White House drug czar William J. Bennett and criminologist John DiIulio Jr. to warn that U.S. inner cities were spawning a generation of “superpredators,” armed to the hilt with little notion of right and wrong. But with crime rates down, talk of superpredators has petered out. For good reason, says Blumstein: They never existed. “There's no evidence that [the murder binge was caused by] a new generation of notoriously different kids,” he says.

    Up the river

    The triggers of the last decade's crime spree are fairly well understood, but there's less agreement about why violence has since abated. Researchers point to factors such as a healthy economy, stricter gun laws, and a massive dose of imprisonment.

    The United States is in the midst of an “incarceration binge,” says Blumstein. From the 1920s to the 1970s, about 110 people per 100,000 were imprisoned. That rate has since quadrupled (see chart below)—but not because police are solving more crimes. The rise is fueled instead by longer prison sentences, higher odds that a convict will serve time, and a massive increase in the number of people arrested for drug offenses.

    To estimate whether the increased incarceration rate has helped reduce violent crime, criminologist Richard Rosenfeld of the University of Missouri, St. Louis, looked at homicide rates in the most dangerous neighborhoods of St. Louis and Chicago. Prisoners disproportionately come from those neighborhoods, so he assumed that their homicide rates would give a reasonable estimate of how many murders prisoners would commit if they were roaming the streets. For every 10% increase in the number of people incarcerated, he concluded, homicides decrease by about 15% to 20%.
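    The arithmetic behind an incapacitation estimate of this kind is simple in outline, even though the real analysis must grapple with far messier data. A minimal sketch, using invented placeholder numbers rather than Rosenfeld's figures:

```python
# Back-of-the-envelope incapacitation estimate (placeholder numbers, not Rosenfeld's data).
# Assumption: prisoners, had they remained free, would commit homicide at roughly the
# rate observed in the high-risk neighborhoods most of them come from.

neighborhood_homicide_rate = 80 / 100_000  # hypothetical annual homicides per resident
additional_prisoners = 5_000               # hypothetical extra people imprisoned in a year

homicides_averted = additional_prisoners * neighborhood_homicide_rate
print(f"Estimated homicides averted per year: {homicides_averted:.1f}")
```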

    Others have worked up more modest estimates. Economist Steve Levitt of the University of Chicago explored the effects of lawsuits brought by the American Civil Liberties Union against several states during the 1980s that aimed to ease overcrowding in prisons. In some cases, judges ordered states to cut their prison populations, while in other states prison populations continued to swell. Comparing these states, Levitt found that a 10% relative drop in prison population led to about a 4% increase in crime.
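    In spirit, that comparison boils down to a difference in growth rates between the two groups of states. The sketch below uses invented numbers chosen only to reproduce the rough 10%-to-4% relationship described above; it is not Levitt's dataset or his econometric model.

```python
# Hypothetical sketch of comparing states ordered to cut prison populations with
# states whose prison populations kept growing (invented numbers; only the
# arithmetic of the comparison is shown).

litigated     = {"prison_change": -10.0, "crime_change": +2.0}  # % change over the period
non_litigated = {"prison_change":   0.0, "crime_change": -2.0}

# Relative differences between the two groups of states
prison_gap = litigated["prison_change"] - non_litigated["prison_change"]  # -10 points
crime_gap  = litigated["crime_change"]  - non_litigated["crime_change"]   # +4 points

# Implied elasticity: % change in crime per 1% change in prison population
elasticity = crime_gap / prison_gap
print(f"A 10% relative drop in prison population implies roughly "
      f"{abs(elasticity) * 10:.0f}% more crime")
```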

    Rosenfeld cautions against viewing incarceration as a panacea for violent crime. In the short term, he says, it provides an “incapacitation” effect, preventing prisoners from committing additional crimes. But in the long term, high incarceration rates could amplify homicide rates. “The scenario could be that the massive incarceration experiment has generated violence” by disrupting families and neighborhoods or thwarting the ability of released prisoners to join the legitimate labor market, Rosenfeld speculates.

    Imprisonment alone can't explain the recent drop in violent crime, Levitt says, because the boom in incarceration started in the 1970s and crime rates started falling almost 20 years later. He and John Donohue III of Stanford Law School point to a surprising factor: the legalization of abortion. The psychological literature shows that unwanted children are more likely to commit crimes, they contend, and demographic data suggest that women who have abortions are disproportionately young and poor—subpopulations whose children are at relatively high risk for committing crimes. Without 1973's Roe v. Wade decision, the researchers reason, more potentially violent children would have reached their peak crime years beginning in about 1991—when crime rates started dropping. They estimate that legalized abortion accounts for 50% of the recent drop in crime.

    “It takes great skill to write a paper that infuriates both the left and the right,” says Blumstein of Levitt and Donohue's idea, “and I think they've done that brilliantly.” But he cautions that they haven't addressed other factors that contribute to their 50% estimate. “To reach a conclusion as aggressive as this takes a much more subtle analysis,” says Blumstein.

    Although incarceration and abortion are debatable contributors to the decline in violent crime, perhaps the most important factor is a drop in the number of guns on the streets. Increased police aggressiveness in pursuing illegal guns led to a rise in weapons arrests until 1993, says Blumstein. Weapons arrests have since declined—probably not because police are paying less attention to weapons, but because fewer people are carrying them.

    In Kansas City and Indianapolis, uniformed police worked overtime to seize more guns in some neighborhoods while continuing standard practices in others. Gun-related crime dropped 49% in the targeted Kansas City areas, and 50% and 22% in the two areas of Indianapolis that received extra gun-oriented patrolling. Charleston, South Carolina, experimented with a bounty program offering payment for tips on illegal weapons; the program inhibited the brandishing of weapons and probably helped cut down the city's homicide rate, says Blumstein. Another major force in keeping guns off the streets is the Brady bill, which went into effect in 1994 and requires people to undergo a background check before they can purchase a gun from a licensed dealer. Convicted felons are prohibited from buying guns. In California and other states, more rigorous background-check systems deny guns to people convicted of misdemeanors involving violence.

    Critics of the Brady bill, particularly the National Rifle Association, contend that such laws are counterproductive because those who want guns can still get them illegally. To investigate whether the bill has had an impact on violent crime, Wintemute compared two groups of people: those who tried to buy weapons after the Brady bill went into effect but were denied because of a prior felony conviction, and those who had been arrested for—but not convicted of—a felony and therefore were allowed to buy a gun. He analyzed their arrest records for the following 3 years. After adjusting for demographic variables and prior arrest records, he found that the people who bought guns were 25% more likely than those who were turned away to commit crimes involving guns or violence.
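    At its core, that comparison is a relative-risk calculation between the two groups of would-be buyers (the published analysis also adjusts for demographics and prior records, a step omitted here). The counts below are hypothetical, chosen only to yield the roughly 25% figure reported; they are not Wintemute's data.

```python
# Hypothetical relative-risk sketch (invented counts, not Wintemute's data):
# compare later gun/violent-crime arrests between people allowed to buy a gun
# (felony arrest but no conviction) and people denied (prior felony conviction).

allowed = {"n": 2_000, "later_gun_or_violent_arrests": 250}
denied  = {"n": 2_000, "later_gun_or_violent_arrests": 200}

risk_allowed = allowed["later_gun_or_violent_arrests"] / allowed["n"]
risk_denied  = denied["later_gun_or_violent_arrests"] / denied["n"]

relative_risk = risk_allowed / risk_denied
print(f"Buyers were {100 * (relative_risk - 1):.0f}% more likely "
      f"to be arrested later for a gun or violent crime")
```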

    However, sociologist Lawrence Sherman of the University of Pennsylvania in Philadelphia cautions that even the best background checks can't stop most gun violence, because statistics show that most gun homicides are committed by people with minimal criminal records. Sherman dismisses another popular gun-control strategy: gun buyback programs. Three studies have tracked gun violence in communities where large numbers of guns were bought back; in none did the buybacks appear to dent crime rates. He says that buybacks should be more targeted, focusing on high-crime neighborhoods and recent gun models.

    For whatever reason, Wintemute says, “gun sales have gone right through the floor” in the 1990s. Fear sells guns, he says, and it's possible that declining crime rates have inspired fewer people to buy guns—which further reduces the crime rate and makes people feel less threatened. Gun sales did nudge upward in 1999, but he suspects that was a Y2K paranoia-related fluke. “The gun industry marketed the heck out of Y2K,” he says.

    The drug world has also quieted down, with fewer new drug users entering the market and thus reducing the demand for new dealers, says Blumstein. And thanks to the booming economy, unemployment is down to levels not seen since the 1970s, he says, even among high-school dropouts, teenagers, and racial minorities—people at highest risk for entering the drug market.

    Future trends

    The FBI's preliminary Uniform Crime Reports figures for 1999 suggest that the recent crime drop may be slowing in big cities. Blumstein used the FBI's numbers to examine murder rates in 17 large cities during 1998 and 1999. Of those, 12 had fewer homicides in 1999 but five had more. Big cities had the first crime booms and led the way in crime drops; Blumstein suggests we might now be seeing a leveling off of violent crime rates in big cities, and other regions may soon follow suit.

    Rosenfeld holds out hope that some factors pushing homicide rates down are here to stay. He speculates that the United States is currently undergoing a “civilizing process.” Historians have traced the rise of a cultural intolerance for interpersonal violence in Europe, he says, and the same phenomenon seems to be happening here. He sees it not as a moral process so much as an aesthetic change in which people disapprove of resorting to violence to settle disputes.

    Sociologist Murray Straus of the University of New Hampshire, Durham, also discerns evidence that violence is increasingly not tolerated in the United States. He's seen substantial decreases in some forms of family violence since 1975. Fewer people report hitting their adolescent children nowadays, although 94% of parents still physically punish their toddlers. Fewer husbands hit their wives, down from 11% to 7% per year between 1975 and 1995, although 11% of wives continue to hit their husbands.

    The most harrowing metric of domestic violence, what the FBI calls “intimate partner homicides,” has also declined steadily in recent decades. The number of such slayings in 1996 (1842) was 36% lower than the total in 1976. Rosenfeld attributes much of this decline to decreased marriage rates. People are marrying later and divorcing more often, and thus are less likely to be trapped in violent relationships, he says. Marriage rates are continuing to decline, Rosenfeld adds, which will probably continue to dampen the rates of intimate partner homicides.

    Not all factors that influence crime rates can be easily changed—“we can't institute a robust economy,” says Wintemute. But the past decade's worth of research on the causes and inhibitors of violent crime offers hope that society can reinforce those factors that squelch violence, he says. Blumstein complains that currently, criminal justice policies “are driven by ideology instead of scientific knowledge.” But as the field of violence research matures, he predicts, its findings will “eventually become compelling in the public debate.”

  23. What Makes a Police Officer a Victim?

    1. Erik Stokstad

    The court hearing for armed robbery had finished, and the 26-year-old defendant was headed back to jail. Everyone knew he was dangerous. The man had a long rap sheet that revealed only a fraction of the crimes he would later admit to, including scores of car thefts, some 40 burglaries, and six armed robberies. But the deputy sheriff charged with transporting the prisoner was kind. He told the lanky defendant that because he was too tall to sit comfortably in the squad car's rear security cage, he could ride handcuffed in the front seat. For this small favor, the deputy said, all the defendant had to do was promise that he “would be good.”

    As they drove, the officer chatted constantly with the prisoner. He stopped the car when he saw two women walking away from a disabled vehicle on the side of the road, and he offered them a ride to town. But when he leaned back to unlock the rear door, the prisoner lunged and grabbed the gun from his holster. After a struggle the prisoner jammed his foot against the officer's throat, pinning him against the door. An instant later, he shot him in the chest. The identities of the victim and the perpetrator and the tragedy's location are kept secret, but the outcome is known: The 28-year-old officer died on the scene.

    While many professions are hazardous to health, few jobs run a greater risk of enduring or inflicting violence than law enforcement. In the United States, about 150 officers die every year in the line of duty, and government figures suggest that some 350 suspected criminals are justifiably killed in confrontations with police. Seeking lessons behind these gruesome statistics and other disturbing trends in violent behavior nationwide (see main text) are researchers at the Federal Bureau of Investigation (FBI).

    The FBI is perhaps best known for “profiling” serial killers—the kind of work made famous by the film The Silence of the Lambs. Beginning in 1979, a handful of agents at the Behavioral Science Unit at the FBI Academy in Quantico, Virginia, interviewed 36 imprisoned serial killers, from Charles Manson to Ted Bundy. They found behaviors, such as a tendency to return to crime scenes, that have helped law enforcement agencies around the country crack unsolved cases.

    Profiling is now part of the FBI's National Center for the Analysis of Violent Crime in Quantico, which also studies everything from sexual killings of the elderly to serial bombings. With 13 full-time agents and crime analysts, it's the largest center in the United States that concentrates on finding ways to catch criminals, says William Hagmaier, who heads the unit's Child Abduction and Serial Murder Investigative Resources Center. “What we're trying to do,” Hagmaier explains, “is learn all we can about criminals and some of their most heinous behaviors and share those insights with local investigators.”

    Unarmed victims aren't the only research subjects. In a unique study, an academy team has overturned common views of the kind of officer most likely to perish while enforcing the law. After interviewing offenders and colleagues of fallen officers, Anthony Pinizzotto and Edward Davis of the academy's Behavioral Science Unit identified, for the first time, behavioral traits of police officers likely to be killed. Their description of a classic victim surprises even the researchers themselves: Like the young deputy slain on the drive from the courthouse, police officers most at risk of death, they found, are ones who are friendly, trusting, reluctant to use force, and less likely to follow police procedures. Although critics point to several shortcomings in the study's design, the FBI researchers claim that findings from their ongoing 10-year-old project have already improved police training around the country.

    Pinizzotto and Davis first selected 50 incidents between 1975 and 1985 in which a total of 54 police officers were killed while carrying out their duties—often while making an arrest or reacting to a crime in progress. The pair pored over court records, talked to police department colleagues of the slain officers, and traveled to 18 states to interview offenders. The duo had expected to find that aggressive policing triggered the murders. “It was a surprise,” Pinizzotto says. “Not one offender said they were being abused or pushed to their limit.” And police department colleagues described the fallen officers as good natured and conservative in their use of force.

    The deadly situations tended to erupt when officers didn't follow the book. Many of the homicides took place, for example, when officers failed to call for backup while checking out a suspicious situation, didn't identify themselves as police officers, or neglected to call in a license plate number during a routine traffic stop. In two-thirds of the cases, the officer either failed to physically control the suspect or let the situation get out of hand. When combined with a violent offender looking for a chance to escape, such a situation can be deadly.

    To prevent such deaths, the FBI issued a report, “Killed in the Line of Duty,” that recommends better training. For instance, it advises police departments to train officers in how to prevent suspects from grabbing police weapons and how to face a drawn gun—although Davis and Pinizzotto decline to give details to avoid giving criminals any tips. The report is not intended for public distribution, but about 150,000 copies were sent a few years ago to police departments around the country. “I teach a police class, and I always use their recommendations,” says Laure Brooks, a criminologist at the University of Maryland, College Park.

    But Brooks and other criminologists have concerns about the way the study was done. For example, there's no way to know whether the dead officers were really friendlier, harder working, or less likely to follow the rules than their peers, who were not assessed in the study. “Measuring the personality of dead cops is useless unless you're also measuring the personality of living cops,” says William King, a criminologist at Bowling Green State University in Ohio. The assessment may also have been confounded by a halo effect, he adds: “How many of these cops are going to say bad things about a fallen comrade?”

    Pinizzotto and Davis acknowledge these shortcomings but say their conclusions are nonetheless valuable for training. In a follow-up study, they have addressed some of these issues and drawn a more detailed picture of the personalities and behaviors of officers who were victims of violent assaults—but who lived to tell the FBI researchers about it. They selected 40 cases, interviewing the officers—most of whom had been shot—and their assailants. In their latest report, “In the Line of Fire,” Pinizzotto and Davis noted that many survivors recalled knowing when not to use deadly force, but they couldn't recall being instructed on when it is appropriate.

    Because many of the officers failed to realize that an escalating confrontation was potentially life-threatening, the FBI researchers are now examining how officers tell when they are in danger. They are also touring the country, lecturing to police departments about their findings. Their presentation includes videotaped interviews. “When you hear from a perpetrator who has killed an officer in cold blood,” Davis says, “that has a chilling effect that most officers don't forget.”
