News this Week

Science  16 Jan 1998:
Vol. 279, Issue 5349, pp. 319
  1. NEUROPHYSIOLOGY

    Teaching the Spinal Cord to Walk

    Ingrid Wickelgren

    A flurry of recent work suggests that, with proper training, some patients with spinal cord injuries can regain at least a limited ability to walk

    Two years ago, 27-year-old Thorsten Sauer grabbed a therapist's hand and took his first steps in 6 years. At the time, he had been confined to a wheelchair since the 1989 motorcycle accident that had partially torn his spinal cord, leaving him almost totally paralyzed from the ribs down. But in 1995, prompted by a television news program, Sauer traveled from his hometown of Erlangen, Germany, to participate in an experimental program run by neurophysiologist Anton Wernig of the University of Bonn. At Wernig's clinic, located near Karlsruhe, a therapist hoisted Sauer and helped him walk slowly on a treadmill for 3 meters while grasping parallel bars. “It was amazing,” Sauer recalls.

    Today, after completing Wernig's 10-week program, in which patients step on treadmills assisted by specially trained therapists and a harness that can support part of their weight, Sauer pushes a wheeled walker around his apartment, stopping to grab books off shelves formerly out of reach. With help, he can even climb a few stairs. And Sauer is not alone. Dozens of other spinal cord-injury patients once confined to wheelchairs can now walk, although in a limited way, thanks to Wernig's program.

    On the move.

    With his weight partially supported by a harness, a spinal cord patient undergoes training on a treadmill.

    ANTON WERNIG/UNIVERSITY OF BONN

    The idea that training can restore some walking ability is buttressed by a growing body of evidence in cats and now in humans. It shows that, contrary to dogma, the adult mammalian spinal cord can perform on its own, largely independent of the brain, many of the functions necessary for walking. What's more, recent data show that neural circuits governing locomotion in the spinal cord can “learn,” by altering their connections, in a way that may help explain some of the improvements Wernig is seeing (see sidebar). “There is a flurry of activity and a pretty upbeat mood. People think that training procedures can enhance the abilities of [those] with spinal cord injuries,” says Keir Pearson, an expert on the neurophysiology of walking at the University of Alberta in Edmonton, Canada.

    More work will be needed to confirm these encouraging, but early, results. Indeed, even supporters caution that no one knows how much improvement individual patients can expect from the treatment. Furthermore, many patients with spinal cord injuries—who number 200,000 in the United States alone—will not benefit from the approach. In particular, notes Sten Grillner, a neurophysiologist at the Karolinska Institute in Stockholm, Sweden, training is unlikely to produce useful walking in patients whose cords are so badly damaged that no connections survive between their brains and the region below their injury. Such patients would have great difficulty voluntarily controlling when their legs start or stop walking. In addition, he says, locomotor training does not restore balance. As a result, quadriplegics like actor Christopher Reeve, who can't use their arms to hold onto walkers and other devices for stability, could not learn to walk without being held upright.

    But if the approach does someday prove successful in larger clinical tests, it might ultimately change the way many individuals with spinal cord injuries are treated. Today, doctors often leave such patients alone, except for therapy to strengthen healthy muscles or maintain flexibility. Current rehabilitative techniques such as helping patients stand or having them cycle their legs in the air have not proved consistently helpful.

    The new work, however, “is saying to a person in a wheelchair: ‘This may not be your lot,’” says J. Thomas Mortimer, a biomedical engineer at Case Western Reserve University in Cleveland. And even though he concedes that training will not restore normal walking, Mortimer notes that merely being able to walk a few paces and climb a few stairs could vastly improve such a person's life, enabling him to enter a friend's home, a movie theater, or a narrow bathroom that would otherwise be off limits to him.

    Moving to the beat

    The first inklings that the mammalian spinal cord houses the sophisticated neural machinery needed for walking emerged in 1910, when Charles Sherrington, a neurophysiologist at Oxford University in the United Kingdom, found that cats whose spinal cords had been completely cut could perform limited stepping motions. But it was decades before anyone could conclusively pin the engine of locomotion to the spinal cord.

    In 1967, Anders Lundberg and his colleagues at the University of Göteborg in Sweden isolated the spinal cord in adult cats by cutting its link to the brain and also paralyzing all the muscles to deprive the cord of movement-related sensory cues. The researchers then activated the animals' spinal neurons with an injection of L-dopa, a precursor for one of the cord's main neurotransmitters, noradrenaline. They found that the neurons that flex the legs and those that extend them fired in an alternating pattern.

    The researchers concluded that the spinal cord holds a “rhythm generator” for locomotion that beats like the heart and is independent of both sensory cues and the brain. In the 1970s, Grillner and Peter Zangger, then also at the University of Göteborg, confirmed Lundberg's results and extended them. They showed that the cord can produce not only a basic locomotor rhythm but also a more detailed electrical pattern in which different neural signals are sent to different leg muscles.
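
    How such a rhythm generator might work can be illustrated with a classic abstraction, the "half-center" oscillator: two mutually inhibiting pools of neurons, one driving flexor muscles and one driving extensors, that alternate on their own once given steady excitation. The sketch below is a textbook-style model with invented parameters, not the circuit that Lundberg's or Grillner's teams actually probed.

    ```python
    # Illustrative half-center oscillator (Matsuoka-style): two mutually
    # inhibiting "flexor" and "extensor" units with slow self-adaptation
    # produce an alternating rhythm with no brain input and no sensory
    # feedback. Parameters are invented for illustration.
    import numpy as np

    def half_center(t_end=10.0, dt=0.001, drive=1.5):
        x = np.array([0.1, 0.0])   # membrane-like states (flexor, extensor)
        v = np.zeros(2)            # adaptation (fatigue) states
        beta, w_inh = 2.5, 2.0     # adaptation strength, mutual inhibition
        tau_x, tau_v = 0.05, 0.6   # fast and slow time constants, seconds
        out = []
        for _ in range(int(t_end / dt)):
            y = np.maximum(x, 0.0)                     # rectified firing rates
            x = x + dt / tau_x * (-x - w_inh * y[::-1] - beta * v + drive)
            v = v + dt / tau_v * (y - v)               # fatigue ends each burst
            out.append(y.copy())
        return np.array(out)

    rates = half_center()
    # When one unit fires it suppresses the other until its own fatigue
    # lets the rival escape, so the two outputs alternate like flexor and
    # extensor bursts during stepping.
    print("fraction of time both units active:",
          round(float(np.mean((rates > 0.01).all(axis=1))), 3))
    ```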

    While researchers were picking up electrical activity in severed spinal cords, various labs were seeing signs that it might have functional consequences. Grillner's team found, for example, that kittens whose spinal cords had been cut walked well, as did adult cats, temporarily, when given certain drugs immediately after their cords were cut. But adult cats with older injuries walked poorly. They needed help placing their paws, balancing, and supporting their weight. And few experts believed such animals could improve. The mature cord was seen as too inflexible to make the subtle adjustments in its wiring required for independent locomotion after an injury.

    But two neurophysiologists, Reggie Edgerton and Serge Rossignol, leading separate teams at the University of California, Los Angeles (UCLA), and the University of Montreal in Canada, weren't so pessimistic. Indeed, both teams showed, in a series of studies in the 1980s, that chronic “spinal cats,” as the animals with severed spinal cords are called, can relearn the locomotor pattern of a normal cat.

    In one study published in 1987, for instance, Rossignol and his former postdoctoral student Hugues Barbeau, now at McGill University in Montreal, demonstrated dramatic improvements in the walking abilities of three spinal cats after two to three sessions a week, during which they were trained to walk with their hindlimbs on a treadmill. The animals at first had to be held up by their tails, but they eventually became able to support their hindquarters while they stepped. They also learned to place their paws sole-first on the treadmill and to take longer, more natural-looking steps. At the same time, quantitative measurements derived from video images of the cats walking showed that their joint angles and leg movements began to mirror those of intact walking felines. In addition, the cats' leg muscles began to exhibit more normal patterns of electrical activity.

    Then, in the early 1990s, Edgerton's team members discovered that what an injured spinal cord learns can be surprisingly specific. They compared the walking abilities of three groups of spinal cats: untrained animals, those trained to step, and others trained only to stand. The team showed that the step-trained cats could, after 5 months, walk more naturally and rapidly than the untrained cats. By contrast, the cats that had practiced standing could hardly step at all. This shows, Edgerton says, not only that the cord can learn but that what it learns depends on the exact sensory input it receives. “If you teach a cat to step, it learns to step. If you teach it to stand, it learns to stand, but it can't step,” Edgerton concludes.

    Despite these encouraging results, most experts dismissed the idea that humans with spinal cord injuries might also learn to walk if trained properly. They had never seen it happen before in people, who after all are not cats. One exception was Barbeau at McGill. In a pilot study completed in 1989, his team had trained 10 patients on a treadmill, using a harness that could support up to 40% of their body weight. After 6 weeks of training, the researchers saw significant improvements—both on the treadmill and on the ground—in either the amount of weight a patient could support while walking or in walking speed, depending on whether a patient could walk without support when the study began. By 1990, Wernig's team in Bonn also had results with 12 patients showing that treadmill training had positive effects.

    Small steps for man

    Few researchers paid attention to these early studies because they were small, lacked controls, and were as yet unsupported by other evidence that the human spinal cord contains the neural machinery needed for locomotion. In 1994, however, such evidence emerged when Blair Calancie of the Miami Project to Cure Paralysis published a case study of a man who had partially severed his spinal cord 17 years before.

    A week after the man began an intense physical therapy regimen that included some walking, he reported that his legs suddenly began “walking” one night when he was lying on his back. Because the walking was involuntary, it suggested, Calancie says, that the man's leg movements were not controlled by his brain but had arisen largely in the spinal cord. Indeed, when Calancie and his colleagues measured the electrical activity of the leg muscles, which reflects that of the nerves controlling them, they found that, as in cats, the extensor and flexor nerves were firing alternately with clocklike regularity.

    Since then, several groups have added to the evidence that the human spinal cord can generate steplike electrical patterns when exposed to sensations associated with walking. In 1995, for example, Volker Dietz and his colleagues at the University Hospital Balgrist in Zurich, Switzerland, induced elements of stepping in 10 paraplegics whose spinal cords were completely disconnected from their brains by placing them on moving treadmills with each one's weight supported in a harness. They found that the patterns of leg-muscle activity in these patients were similar to those in healthy subjects during treadmill walking.

    In that same year, Wernig and his team in Germany published the first strong documentation that the spinal cord's walking program can be trained after an injury. In their study, published in the European Journal of Neuroscience, the researchers compared results with partially paralyzed patients whom they trained on the treadmill for 3 to 20 weeks with those of matched controls treated conventionally with other forms of exercise. Of 36 patients with recent cord injuries who were wheelchair-bound at the start of the study, 33 learned to walk independently, at least with the aid of walkers or canes, after treadmill training. By comparison, only 12 of 24 wheelchair-bound controls became independent walkers with conventional therapy. And 25 of another 33 patients with older injuries who had been wheelchair-bound learned to walk independently with Wernig's program, compared to just one of 14 controls.
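
    The counts Wernig reported imply a large effect. As a rough plausibility check (our illustration, not the statistical analysis the paper itself used), one can run Fisher's exact test on each two-by-two table of independent walkers versus nonwalkers:

    ```python
    # Fisher's exact test on the counts quoted above; a back-of-the-
    # envelope check, not the paper's own analysis.
    from scipy.stats import fisher_exact

    # recent injuries: 33 of 36 trained patients vs. 12 of 24 controls
    recent = [[33, 36 - 33], [12, 24 - 12]]
    # older (chronic) injuries: 25 of 33 trained vs. 1 of 14 controls
    chronic = [[25, 33 - 25], [1, 14 - 1]]

    for label, table in (("recent", recent), ("chronic", chronic)):
        odds, p = fisher_exact(table)
        print(f"{label} injuries: odds ratio ~{odds:.0f}, p = {p:.2g}")
    ```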

    Just last year, in the Journal of Neurophysiology, Susan Harkema, Bruce Dobkin, Edgerton, and their UCLA colleagues reported detailed evidence from the human spinal cord that helps explain how a program of exercise walking might help paraplegics. While four patients with complete cord injuries walked with assistance on a treadmill, the researchers recorded the electrical activity in three leg muscles and the instantaneous load on each leg. They did the same in two able-bodied people, who walked unassisted.

    In both sets of subjects, the researchers found that the spinal cord's output, as measured by muscle activity, depended greatly upon the load on the legs. The greater the load, the higher the activity. What's more, the activity was timed to the phase of the step cycle so that it rose just when appropriate to facilitate stepping. The results provide “excellent evidence,” Edgerton says, that the human spinal cord relies on complex sensory information, including load, to orchestrate walking.

    Someday such data will help optimize training regimens by giving researchers a better idea of exactly what kinds of sensory cues the spinal cord needs to govern walking most effectively. But it may take years before most clinicians adopt locomotor training as part of their rehabilitation programs. For one thing, experts schooled in other approaches are not convinced that treadmill walking is better—in part because they may be unfamiliar with the evidence, and in part because no large-scale comparison study has yet been done. “I don't think any of these newer techniques have been proven superior to conventional treatment,” says John Ditunno, a rehabilitation specialist at Thomas Jefferson University Hospital in Philadelphia.

    Even proponents of the new approach urge caution. “A lot more studies need to be done to show that training makes a significant difference,” notes Edgerton. “We're confident that it does, but we need to be careful, because that's an important conclusion.”

    Some of these studies may not be long in coming. Calancie's team at the Miami Project has already begun testing locomotor training with both a treadmill and a ceiling-mounted circular track to which patients are harnessed, allowing them to step forward along the track much as a shirt slides along the rack at a dry cleaner's. The track enables somewhat more realistic walking than a treadmill because patients move over the ground instead of having the ground move under them, Calancie says.

    And there may be other, very different improvements on the way. In a paper to appear this month in the Journal of Neurophysiology, Rossignol, Barbeau, and their graduate student Connie Chau showed that the drug clonidine, which they knew helped turn on the cord's locomotor pattern, can speed the recovery of cats from spinal cord transection when combined with locomotor training. “Within a week,” Rossignol says, “the cats are walking with their hindlimbs” without needing more of the drug. Without clonidine, similar recovery in cats takes 3 to 4 weeks. Thus, in the future, patients might be treated with a combination of medication and walking workouts.

    For now, pioneering patients such as the young, good-natured Thorsten Sauer, who once relished the freedom of a motorcycle, bask daily in the small freedom afforded by walking a few paces on their own. “The human body is not built for sitting,” Sauer declares. “Sometimes it should walk.”

  2. NEUROPHYSIOLOGY

    Watching "Walking" Nerves Learn

    Ingrid Wickelgren

    Researchers have long known that in the brain, learning is accompanied by subtle changes that strengthen or weaken the connections between neurons. As a result, they've assumed that similar neuronal “plasticity” also underlies the ability of the spinal cord to learn how to control walking after an injury destroys connections to the brain (see main text). But while plasticity had been seen in other motor systems, no one had directly seen it in the spinal cord's walking circuits—until now, that is.

    Working with cats, neurobiologist Keir Pearson of the University of Alberta in Canada and his colleagues Patrick Whelan and Gordon Hiebert have shown that the influence of sensory nerves in the spinal locomotor system can adapt to compensate for an injury. “If you want any motor system that involves reflexes to work precisely, those reflexes must be modifiable. Our work provides the first glimpse of this type of thing in the locomotor system,” says Pearson, whose team's results were published in 1995 and 1997 in the Journal of Neurophysiology.

    The Pearson team members made the discovery serendipitously while studying the way sensations from the leg muscles influence walking in the cat. In 1994, while conducting those experiments, they cut a sensory nerve in a calf muscle called the lateral gastrocnemius (LG). When this nerve is stimulated, the signals it sends to the spinal cord elicit in the leg's motor neurons responses that prolong stance—keeping a walking animal's leg extended and on the ground. This helps calibrate the timing of each step.

    But a few days after the researchers cut the nerve, they noticed that stimulating it didn't keep the leg on the ground as long as usual. At the same time, the cats began walking more normally again. So Whelan and Pearson began to wonder whether some change in the spinal cord might be compensating for the damage to the LG sensory nerve.

    To find out, Pearson's team cut the LG nerves in one hind leg on each of 10 adult cats. Then, after 3 to 28 days, team members stimulated both the LG nerve and a sensory nerve in another calf muscle—the medial gastrocnemius (MG)—while the cats walked on a treadmill. Within 5 days, the researchers found, the ability of the severed LG nerves to prolong stance was much lower than that of the controls, while the stance-prolonging ability of the MG nerves in the injured legs had increased. Thus, the neuronal circuitry had changed to compensate for an injury-induced deficit.

    To prove that the changes had taken place in the cord and not the brain, Pearson and Whelan repeated the experiment in another group of cats, this time cutting the spinal cord in nine of them. With the cord now isolated from the brain in those animals, the researchers found the same decreased influence of the LG nerve in all of the cats that could be evaluated; in some of them, they also saw an increased effectiveness of the MG nerve. This showed that at least part of the plasticity occurred in the cord.

    The mechanism underlying this plasticity is so far unknown. But Pearson speculates that walking may produce “activity-dependent competition,” in which two sensory nerves with similar functions compete for influence in the spinal cord. If one neuron is then cut, its connections to spinal neurons will weaken, allowing the competing nerve to exert greater influence.
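
    Pearson's competition idea can be made concrete with a toy model: two inputs share a fixed pool of synaptic influence, active inputs grow, and the pool is then renormalized. Everything below is invented for illustration; the real mechanism, as he notes, is unknown.

    ```python
    # Toy "activity-dependent competition" between the LG and MG inputs.
    # Numbers and update rule are invented; this only shows the logic.
    import numpy as np

    w = np.array([0.5, 0.5])           # LG and MG synaptic weights
    total = w.sum()                    # fixed pool of influence they share

    def one_day(w, lg_active):
        act = np.array([1.0 if lg_active else 0.0, 1.0])  # daily activity
        w = w + 0.2 * act * w          # Hebbian growth for active inputs
        return total * w / w.sum()     # competition: renormalize the pool

    for _ in range(10):
        w = one_day(w, lg_active=False)    # the LG nerve has been cut
    print("LG vs MG influence after 10 days:", np.round(w, 3))
    # The silenced LG weight withers while MG's grows, mirroring the weaker
    # LG and stronger MG stance-prolonging effects Pearson measured.
    ```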

    If sensory feedback from walking does influence the strength of spinal connections, this type of plasticity could underlie the improved walking that can be induced in some paraplegics by locomotor training, Pearson suggests. By helping to compensate for a sensory or motor weakness—in this case, from a spinal cord rather than peripheral lesion—it could help a patient improve the rhythm of his or her walk as the cord learns when to bend an extended leg to start the next step.

  3. ASTRONOMY

    Intruder in a Star's Dust Glimpsed

    Robert Irion, a free-lance science writer in Santa Cruz, California

    WASHINGTON, D.C.—Astronomers may have detected hints of a newborn planet in its dusty nursery around a nearby star. Images from the Hubble Space Telescope reveal pronounced warps in a dramatic disk of dust circling Beta Pictoris, a young star 63 light-years away. Researchers who unveiled the images here at a meeting of the American Astronomical Society last week said the bulges may betray the existence of at least one planet. But others suggested that gravitational nudges from a passing star or a large dim companion, such as a brown dwarf, might account for the distortions.

    Warped by a planet?

    This false-color view of the Beta Pictoris disk extends inward to a radius smaller than that of Uranus's orbit.

    A. SCHULTZ (CSC/STSCI) AND S. HEAP (GSFC/NASA)

    Astronomers first spotted a dusty veil girdling Beta Pictoris in 1984 in images from IRAS, an infrared satellite. The disk, seen from Earth nearly edge-on, may have provided the ingredients for planet formation. In September 1997, Hubble's new STIS spectrograph blocked the harsh light of Beta Pictoris to probe the disk in great detail—thereby spying features as close to the star as the orbit of Uranus around the sun, some 60% closer than a 1995 Hubble study that spied tentative signs of the warping.

    Based on Hubble's 1995 observation, researchers had speculated that a planet might be perturbing the disk, but others suggested that strong radiation pressure from the star was at work instead. Astronomer Sally Heap of NASA's Goddard Space Flight Center in Greenbelt, Maryland, says that asymmetric bulges seen in the innermost part of the disk in the new image now rule out that scenario and favor a planet, circling Beta Pictoris in an orbit slightly inclined to the plane of the disk. The planet could range in size from 10 times the mass of Earth to 17 times the mass of Jupiter, depending on its distance from the star.

    Astronomer Fred Bruhweiler of the Catholic University of America in Washington, D.C., a member of the observing team, isn't convinced about the planet. “I don't think it's quite that tied down yet,” he says. Bruhweiler points to warpings of the disk at much greater distances from the star—as far out as 20 times Pluto's distance from the sun. Those could not possibly arise from a planet in a tight orbit, he claims. Bruhweiler prefers a disturbance on a much grander scale, from either a star that swung by Beta Pictoris but has long since disappeared or an invisible companion dwarf star.

    Astronomer Sergio Fajardo-Acosta of the University of Denver notes that Heap's and Bruhweiler's deductions are not mutually exclusive. Different phenomena could distort different parts of the disk, he says. But Fajardo-Acosta also urges his colleagues not to rule out unevenly distributed clumps of matter within the disk itself or other sources of asymmetry, such as dense clouds of young comets. Theoretical models of the various proposals should eventually help astronomers decide whether a planet does indeed lurk in the dusty disk of Beta Pictoris.

  4. ASTRONOMY

    Hubble Sees All the Light There Is

    Robert Irion, a free-lance science writer in Santa Cruz, California

    WASHINGTON, D.C.—The most penetrating view of the cosmos ever, recorded by the Hubble Space Telescope, may have revealed nearly all sources of visible light in the universe. At a meeting of the American Astronomical Society here last week, astronomers announced an analysis suggesting that little if any light shines from just beyond the Hubble Telescope's range of vision. However, many galaxies may be veiled from Hubble's view by far-off dust clouds—an idea supported by other results reported at the meeting.

    Hubble's 2-week-long exposure in 1995, called the Hubble Deep Field, captured a rich panoply of distant galaxies freckling a tiny and seemingly barren patch of the sky. The field's faintest galaxies are billions of times fainter than the dimmest objects the human eye can see. Even so, astronomers wondered whether a bigger telescope or a longer exposure would reveal countless more glowing islands of visible light in the universe's deep recesses. The discovery of more galaxies shining in the very distant universe, when it was a fraction of its current age, would affect theories that describe when and how the earliest stars and galaxies took shape after the big bang.

    Baring all.

    This Hubble image, shown in false color, captures nearly all visible galaxies.

    M. S. VOGELEY/PRINCETON UNIVERSITY OBSERVATORY

    To address those doubts, astronomer Michael Vogeley of the Princeton University Observatory explored the blank spaces within part of the Hubble Deep Field. If fainter galaxies exist in great numbers, Vogeley reasoned, they should create subtle but detectable “ripples” of visible light between the brighter galaxies. But his rigorous statistical analysis uncovered no such ripples. “An infinitely long exposure with Hubble would find, at most, only a few percent more light at optical wavelengths,” Vogeley says.
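
    The logic of the test can be sketched in a few lines: a sea of unresolved faint galaxies would add spatial variance to "blank" sky pixels beyond what photon noise alone predicts, since for pure Poisson noise the variance equals the mean. The simulation below is a schematic of that idea, not Vogeley's actual analysis.

    ```python
    # Schematic surface-brightness fluctuation test with invented numbers.
    import numpy as np

    rng = np.random.default_rng(1)
    npix, sky = 512 * 512, 100.0   # pixels in a blank patch; mean sky counts

    def blank_patch(hidden_per_pix=0.0, flux=5.0):
        img = rng.poisson(sky, npix).astype(float)        # photon noise only
        img += flux * rng.poisson(hidden_per_pix, npix)   # unresolved galaxies
        return img

    for dens in (0.0, 0.2):
        img = blank_patch(hidden_per_pix=dens)
        excess = img.var() - img.mean()  # pure Poisson noise: var equals mean
        print(f"hidden-galaxy density {dens}: excess variance = {excess:.1f}")
    # An excess near zero means little light remains undetected, which is
    # what Vogeley's null result showed.
    ```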

    Still, many other galaxies may be hiding behind veils of dust, which sops up their light and converts it to infrared radiation. Astronomers also announced at the meeting that a NASA satellite called the Cosmic Background Explorer found a surprisingly bright infrared glow across the entire sky—evidence that dust hides more distant galaxies than previously believed (Science, 9 January, p. 165). “We thought the Hubble Deep Field had showed us most of the objects that emit light in the universe,” says cosmologist David Spergel of Princeton University. “But nature is telling us, ‘No, it hasn't.’”

    Nevertheless, theorists believe Vogeley's work will help refine current models of galaxy evolution—at least until future telescopes can census the additional dust-shrouded galaxies. As astrophysicist Neil deGrasse Tyson of the American Museum of Natural History in New York City put it: “We now have added confidence that few galaxies are missing when we look to the sky with our most powerful optical telescopes.”

  5. CANCER RESEARCH

    Peptide-Guided Cancer Drugs Show Promise in Mice

    Marcia Barinaga

    As any cancer patient who has endured chemotherapy knows, most regimens walk a fine line between killing the tumor and killing the patient. That's because chemotherapeutic drugs spread throughout the body, reaching not only the tumor but also healthy organs such as the gut and bone marrow, where they kill off normal dividing cells. To make matters even worse, tumor cells are also quick to mutate and become resistant to the drugs. Now, on page 377, a team led by Erkki Ruoslahti at The Burnham Institute in La Jolla, California, reveals a strategy that may get around both problems.

    Ruoslahti and his colleagues have devised a way to target cancer drugs to the new blood vessels that nourish the tumor. They found small peptides that zero in on the cells lining these newly formed blood vessels, then linked the peptides to the chemotherapeutic drug doxorubicin. By addressing the toxic drug specifically to the tumor, the strategy spares other tissues. And because the tumor vessel cells are not cancerous themselves, they are much less likely to develop resistance to the drugs than are the highly mutable cancer cells. Indeed, when the researchers gave the peptide-drug combination to mice with large tumors, it killed off the blood vessels, stopped tumor growth, and allowed the mice to survive the cancer.

    Zeroing in.

    In standard chemotherapy (left), drugs (orange) flood both tumors and normal tissues. Drugs targeted to tumor blood vessels (right) spare normal tissues.

    ADVANCED MEDICAL GRAPHICS

    Other researchers, such as Judah Folkman at Harvard Medical School, have shown that inhibiting angiogenesis, as new blood-vessel growth is called, can block tumor growth in animals. Indeed, such work has made angiogenesis a hot research area in recent years (Science, 24 January 1997, p. 482). But progress on that approach has been slowed because the best angiogenesis inhibitors are proteins that are expensive and laborious to produce in bacteria. Ruoslahti's method lacks that disadvantage. “In theory, this is immediately translatable into the clinic,” says tumor biologist Bob Kerbel, of the Sunnybrook Health Science Center at the University of Toronto, “because you are dealing with a drug that is already available and clinically approved. And it would not be difficult to produce these peptides.”

    What's more, the technique used by the Ruoslahti team to identify peptides that home in on tumor vessels can identify peptides that bind specifically to the blood vessels of other organs. This means that peptides could be developed to carry drugs to many different tissues to treat conditions other than cancer. “Ruoslahti can address drugs wherever he wants to in a nontoxic way,” enthuses Folkman. “A few years from now, this will be the basis of a new pharmacology.”

    The idea of ferrying chemotherapeutic drugs to tumors on the backs of other molecules isn't new. But large, ungainly proteins such as antibodies have generally been used as the vehicles, with mixed success. To find small peptides that could deliver drugs specifically to tumors, Ruoslahti and then-postdoc Renata Pasqualini turned to a technique developed in the late 1980s by George Smith at the University of Missouri in Columbia. Called phage display, the technique involves engineering bacterial viruses, or phages, so that each displays a different random peptide on its surface. The technique had been used to find peptides that stick to particular proteins, by exposing a mix of peptide-displaying phages to a surface coated with the protein. Ruoslahti reasoned that if the phages were injected into an animal, the technique might be able to identify peptides that stick to specific tissues.

    Pasqualini made the idea work. Two years ago, she and Ruoslahti reported in Nature that they had identified peptides that home in on the blood vessels of the kidneys or brains of mice. “We practiced first with normal organs,” says Ruoslahti, “although, all along, our aim was to look at tumors.” And in this issue, the team reports that when they tried the scheme on mice with tumors, they found several peptides that stick to molecules found only in tumor-associated blood vessels.

    One of the peptides the team identified binds to αvβ3 integrin, a cell-adhesion protein that David Cheresh's group at The Scripps Research Institute in La Jolla had already shown to be concentrated in angiogenic blood vessels. But the group also identified “a whole panel of other peptides,” Ruoslahti says, that bind to as-yet-unidentified molecules specific to angiogenic vessels. Not only do those peptides represent alternate means of conveying drugs, but they also will be useful probes for studying the proteins they target.

    Next, the team chose two of the peptides and hooked them individually to the anticancer drug doxorubicin, to see if they would guide the drug to tumors and kill them. They did. When given to mice that had large tumors derived from human breast cancer cells, even tiny amounts of the peptide-linked drug were better at stunting tumor growth than was free doxorubicin, which was hampered by its toxicity.

    Indeed, some of the mice treated with the doxorubicin-peptide conjugate lived for 6 months after the treatment, while those treated with doxorubicin alone died either of tumors or of drug poisoning at the high doses. “We were never able to find a concentration of free doxorubicin that was anywhere near as effective as our conjugate,” Ruoslahti says. The tumors don't disappear completely, he notes, but what remains seems to be inactive scar tissue. “The mice live very long, so it doesn't seem to bother them.”

    Ruoslahti thinks that the drug conjugates act primarily by killing blood vessels that feed the tumor, although the drug may also diffuse into the tumors and kill cells directly. “The technique probably targets both the vessels and the tumor cells. That is its big advantage,” says tumor-cell biologist Bruce Zetter of Harvard Medical School. “It gives you a kind of double-pronged therapeutic effect that should be quite powerful.” And, of course, a similar double-punch could be achieved with other drugs. “There may be things we haven't used because of their high toxicity that could be used in a more directed way,” says Zetter.

    Ruoslahti's isn't the only team to kill tumors by directing drugs to their blood vessels. Early last year, Philip Thorpe's group at the University of Texas Southwestern Medical Center in Dallas reported that a blood-clotting factor targeted to tumor blood vessels with an antibody caused massive clotting and killed the tumors. And in November, a University of Minnesota team led by S. Ramakrishnan reported in the International Journal of Cancer that it had slowed tumor growth in mice by targeting diphtheria toxin to tumor blood vessels using vascular endothelial growth factor, a protein that binds to a receptor that is plentiful in new vessels.

    Thorpe notes that his work relied on experimentally engineered tumor cells, so it “didn't directly extrapolate to humans” as Ruoslahti's does. And many in the field favor the latter approach over others that use proteins, because the small peptides are easy to make and use and the technique can be tailored to many different tissues.

    The path to clinical trials of the peptide-doxorubicin conjugate looks fairly clear, but oddsmakers know only too well that the favorite out of the gate will not necessarily be the first across the finish line. “We are in very early stages of anti-angiogenesis therapy,” says Zetter. “We have to test them all, and go after the best.” Where peptide conjugates finish will be apparent only after the race is run.

  6. INFECTIOUS DISEASE

    Sequence Offers Clues to Deadly Flu

    Gretchen Vogel

    The Hong Kong “bird flu” that has killed four people, sickened more than a dozen, and prompted the mass slaughter of more than 1.5 million chickens in the last month is still perplexing to scientists. Researchers are trying to discover why this virus is so deadly and why, unlike most known avian viruses, it can infect human cells. Now, in a report on page 393, a team from the United States and Hong Kong provides the most careful look yet at the virus—a complete sequence of the genes that code for its surface proteins and partial sequences of the remaining genome. Although the sequence so far can't reveal all of the virus's biological tricks, it offers clues as to how the virus infects cells, and it lays the groundwork for understanding what makes the bird flu a killer.

    This particular virus was isolated from a 3-year-old boy in Hong Kong, who died in May after coming down with a flulike disease that did not match any of the known human influenza strains (Science, 12 September 1997, p. 1600). It did, however, match a bird strain, called H5N1 because of the varieties of the proteins hemagglutinin and neuraminidase on its surface. H5N1 had infected and killed thousands of chickens in Hong Kong a few months earlier, but no one expected it to jump to humans.

    To infect cells, viruses must attach to specific binding sites on the cell membrane, and human and bird sites are different enough that researchers assumed a single flu virus could not infect both species. Avian strains generally have to mix with human flu viruses in an intermediate host, such as pigs, to produce a new variety dangerous to humans. When H5N1 broke this rule, it triggered a public health alarm. Because people had never before been infected by the bird strain and therefore have no immunity to it, epidemiologists worried that the strain could trigger a pandemic.

    By analyzing the virus's gene sequences, researchers led by Kanta Subbarao of the U.S. Centers for Disease Control and Prevention in Atlanta have now confirmed experts' first hunch: The virus is indeed derived from an avian influenza strain, evidently without an intermediate host. This is probably not the first time such a leap has happened, but it's the first time scientists have been able to observe it directly, says Subbarao.

    The team members also have uncovered a possible clue to what makes the strain so deadly to both birds and people. When they sequenced the gene for hemagglutinin, they found an insertion that is common among especially virulent bird viruses but had never before been seen in a virus isolated from a human. The insert codes for several additional amino acids right next to a crucial spot where cellular enzymes cleave the hemagglutinin protein. That cleavage helps the protein coat break apart, allowing the virus to infect cells.

    Scientists suspect that the cleavage site is key to a virus's infectivity. The enzymes that cleave the most common form of the protein are abundant in the digestive and respiratory systems of birds, and most flu strains can infect only those cells, says team member Michael Perdue of the U.S. Department of Agriculture's (USDA's) Southeast Poultry Research Laboratory in Athens, Georgia. But the extra amino acids may provide an easier—and less specific—target for enzymes, allowing the virus to infect other tissues, including heart, brain, and blood vessels.
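
    The virulence marker the team found can be illustrated with a simple motif scan: a run of basic residues (arginine or lysine) immediately upstream of the hemagglutinin cleavage point. The sequences below are simplified stand-ins, not the actual H5N1 genes.

    ```python
    # Scan for a polybasic insert just before the HA1/HA2 cleavage point.
    # Example tails are simplified stand-ins, not real viral sequences.
    import re

    POLYBASIC = re.compile(r"[RK]{4,}R$")   # run of basic residues, then Arg

    def has_polybasic_site(ha1_tail):
        """ha1_tail: residues just upstream of the cleavage point."""
        return bool(POLYBASIC.search(ha1_tail))

    examples = {
        "low-path-like": "PQRETR",       # lone basic residue before the cut
        "high-path-like": "PQRERRRKKR",  # polybasic insert before the cut
    }
    for name, tail in examples.items():
        print(f"{name}: polybasic cleavage site = {has_polybasic_site(tail)}")
    ```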

    Virologist Robert Webster of St. Jude Children's Research Hospital in Memphis, Tennessee, suspects that the insert “allows the virus to become systemic. Instead of just replicating in the respiratory tract, it now can spread through the bloodstream.” Poultry victims of the virus suffered general hemorrhaging and death within a few days. It is still not clear whether the virus works the same way in humans, however. None of the human victims hemorrhaged, although several had suspicious kidney failure, says Webster.

    The virus isolated from the boy is deadly to chickens, however. The Subbarao team experimentally infected 24 chickens with it, and all but one died. That raises concerns for the USDA, says Perdue. If a human were to carry the flu back from Hong Kong, it could be devastating to U.S. poultry.

    What the scientists still don't know is exactly how this flu strain manages to infect humans. To solve that question, Subbarao says, researchers are closely examining a range of avian flu viruses, hoping to pinpoint how this H5N1 strain is different. Webster says he and his colleagues have uncovered one potential clue. In work in press at The Lancet, he and his colleagues report that the hemagglutinin of viruses isolated from Hong Kong chickens in March contains a carbohydrate near the site where it binds to cell surfaces, but that molecule is missing from the H5N1 strain isolated from the boy. Webster says the change “may have great influence” on the virus's ability to bind to human cells.

    However the virus has altered to allow bird-to-human infection, it still doesn't pass easily between humans. So far, there has been only one suspected case of transfer from one person to another: The toddler's doctor has antibodies to the virus but never got sick. And for now, there has been a drop in new infections—the latest individual became ill on 28 December. But epidemiologists are keeping a wary eye on Hong Kong, especially as the yearly flu season begins. Although no new cases have been reported, officials fear that a currently circulating version could mix with a strain more adept at infecting humans, sparking a pandemic.

    The chances of that pandemic are greatly reduced, say most researchers, now that Hong Kong's millions of chickens in open-air markets have been killed. “The slaughter was absolutely essential,” says Webster. “The big question is whether the stable door was shut in time.”

  7. ARCHAEOLOGY

    Sea-Floor Dust Shows Drought Felled Akkadian Empire

    Richard A. Kerr

    SAN FRANCISCO—When civilizations collapse, the blame is often laid on the culture itself—leaders who overreached, armies that faltered, farmers who degraded the land. Such were the conventional explanations for the end of the world's first empire, forged by the Akkadians by 2300 B.C. Their realm stretched 1300 kilometers from the Persian Gulf in present-day Iraq to the headwaters of the Euphrates River in Turkey. They were the first to subsume independent societies into a single state, but the Akkadian empire splintered a century later, not to be reunited in such grandeur for 1000 years.

    In 1993, however, archaeologist Harvey Weiss of Yale University proposed that the Akkadians were not to blame for their fate. Instead, he argued that they were brought low by a wide-ranging, centuries-long drought (Science, 20 August 1993, p. 985) that toppled other civilizations too, including those of early Greece, the pyramid builders in Egypt, and the Indus Valley in Pakistan. Many archaeologists were skeptical because the timing of these collapses was imprecise, and purely social and political explanations seemed to suffice. But now Weiss's theory, at least as applied to the Akkadians, is getting new support from a completely independent source: an accurately dated, continuous climate record from the Gulf of Oman, 1800 kilometers from the heart of the Akkadian empire.

    Left in the dust.

    Clues such as grain storage vessels and dust from a core show that drought did in the Akkadians in 2200 B.C.

    H. WEISS/YALE UNIVERSITY

    At the annual fall meeting last month of the American Geophysical Union here, paleoceanographers Heidi Cullen and Peter deMenocal of Lamont-Doherty Earth Observatory in Palisades, New York, and their colleagues reported that a sediment core retrieved from the bottom of the gulf matches Weiss's version of events: The worst dry spell of the past 10,000 years began just as the Akkadians' northern stronghold of Tell Leilan was being abandoned, and the drought lasted a devastating 300 years. The new results illustrate, says Weiss, that climate change “is emerging as a new and powerful causal agent” in the evolution of civilization.

    Some archaeologists aren't willing to accept that the same drought changed history across the Old World, however. That argument “just doesn't float,” says archaeologist Carl Lamberg-Karlovsky of Harvard University. But he and others agree that the new marine record lends support to the climate-culture connection that Weiss identified at the ruined city of Tell Leilan in the northern part of Mesopotamia, a region that includes parts of present-day Syria, Iraq, and Turkey. Weiss began excavations there, on the Habur Plains of northeast Syria, in 1978.

    Tell Leilan was a major city covering 200 acres by the middle of the third millennium B.C., and its people thrived on the harvests of the plains' fertile fields. But, unlike the farmers of Sumer in southern Mesopotamia, who used irrigation from the Euphrates and Tigris rivers to ensure bountiful harvests, the farmers of Tell Leilan depended on plentiful rainfall to water their fields. Less than a century after the people of Akkad in central Mesopotamia extended their reach into the north, those rains began to fail, says Weiss.

    When Weiss and Marie-Agnès Courty, a soil scientist and archaeologist at the National Center for Scientific Research in Paris, dug through the accumulated debris of Tell Leilan, they encountered an interval devoid of signs of human activity, containing only the clay of deteriorating bricks. The abandonment began about 2200 B.C., as determined by carbon-14 dating of cereal grains. Soil samples from that time showed abundant fine, windblown dust and few signs of earthworm activity or the once-abundant rainfall. All this suggested that the people of Tell Leilan, and, presumably, its environs, retreated in the face of a suddenly dry and windy environment, triggering the collapse of the Akkadian empire's northern provinces. Only after the signs of dryness abated, about 300 years later, was Tell Leilan reoccupied.

    Weiss went further, however, proposing that refugees from the drought went south, where irrigation helped protect crops. Droves of immigrants would have further strained a sociopolitical system already stressed by the same drought, he says, until the whole system collapsed. And he noted that the pyramid-building Old Kingdom of Egypt, the towns of Palestine, and the cities of the Indus Valley went into precipitous declines at about the same time and apparently also suffered unstable climates.

    It's a neat story, but critics questioned whether the drying really was catastrophic enough to bring down all of Mesopotamian civilization, where irrigation would have helped farmers cope with the drought. And they were even more skeptical that such a drought could have felled other cultures across the Old World. To test these ideas, deMenocal and Cullen decided to see just how big and bad the drought really was. They analyzed sediment from the Gulf of Oman, reasoning that if all of Mesopotamia had become a dust bowl, the hot northwest summer wind called the Shamal would have blown that dust down the Tigris and Euphrates valley, over the Persian Gulf, and finally into the Gulf of Oman, 2200 kilometers from Tell Leilan.

    Cullen and deMenocal looked for this far-traveled dust in a 2-meter sediment core spanning the past 14,000 years, which was retrieved from the Gulf of Oman by paleoceanographer Frank Sirocko of the University of Kiel in Germany. In samples taken every 2 centimeters along the core, they measured the amounts of dolomite, quartz, and calcite—minerals that today dominate the dust blown from Mesopotamia by the Shamal. They found that wind-blown dust levels in the Gulf of Oman were high during the last ice age until about 11,000 years ago, then settled down to levels more typical of today. But in the sample from 2000 B.C. plus or minus 100 years, as dated by carbon-14, the abundance of dust minerals jumped to two to six times above background, reaching levels not found at any other time in the past 10,000 years.

    The extreme dustiness—which suggests a wide-ranging area of dryness—persisted through the next sample 140 years later but faded away by the third sample, indicating a duration of a few hundred years. The team also tracked isotopes of strontium and neodymium, which occur in different ratios in dust from different regions. They confirmed that during the dust pulse, the proportion of minerals with a composition similar to that of the soils of Mesopotamia and Arabia increased.
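
    The arithmetic behind "two to six times above background" is an enrichment factor: each sample's dust-mineral abundance divided by a baseline estimated from the surrounding Holocene samples. A sketch with invented numbers:

    ```python
    # Enrichment of wind-blown dust minerals relative to background.
    # The per-sample percentages here are invented for illustration.
    import numpy as np

    # percent dolomite + quartz + calcite per 2-cm core sample
    samples = np.array([4.8, 5.1, 5.0, 4.9, 14.7, 21.3, 9.8, 5.2, 5.0])
    background = np.median(samples)    # robust baseline for the Holocene

    for i, enrich in enumerate(samples / background):
        flag = "  <-- dust event" if enrich >= 2.0 else ""
        print(f"sample {i}: {enrich:.1f}x background{flag}")
    ```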

    Given the uncertainties of carbon dating, the marine dust pulse and the abandonment of Tell Leilan could still have been several centuries apart. But Cullen and deMenocal found in the core another time marker that makes a somewhat tighter connection. Less than about 140 years before the dust pulse is a layer containing volcanic ash. And Weiss had already reported that a centimeter-thick ash layer lies just beneath the onset of aridity and abandonment at Tell Leilan. The strikingly similar elemental compositions of the two ashes imply that they stem from the same volcanic event. If so, then Tell Leilan was abandoned just after the start of a climatic change of considerable magnitude, geographical extent, and duration. “There's something going on, a shift of atmospheric circulation patterns over a fairly large region,” says Cullen.

    Some archaeologists agree that this climate shift did change history outside northern Mesopotamia. “Most people who work in this range of time don't pay much attention to climate,” says archaeologist Frank Hole of Yale; “rather, it's political and social events [that matter]. … But I think the evidence is overwhelming that we've got something going on here.”

    While conceding that climate and culture interact, a number of archaeologists still think that Weiss is pushing the connection too far. Drought may well have driven people from farmland dependent on rainfall, like that around Tell Leilan, says Lamberg-Karlovsky, but Weiss “generalizes from his northern Mesopotamia scenario to a global problem. That's utterly wrong. … Archaeologists fall in love with their archaeological sites, and they generalize [unjustifiably] to a larger perspective.”

    Even in Mesopotamia, “you do not have by any means a universal collapse of cultural complexity,” says Lamberg-Karlovsky. For example, at 2100 B.C., in the midst of the drying, the highly literate Ur III culture centered in far southern Mesopotamia was at its peak, he says, as was the Indus River civilization to the east, which thrived for another 200 years. Weiss counters that cuneiform records show that Ur III did in fact collapse 50 years later, apparently under the weight of a swelling immigrant population and crop failures. That timing still fails to impress Lamberg-Karlovsky, who concludes that Weiss is “getting little support for the global aspect of it.”

    Such support may yet come from climate records being retrieved from around the world. In an enticing look at the postglacial climate of North America, Walter Dean of the U.S. Geological Survey in Denver found three sharp peaks in the amount of dust that settled to the bottom of Elk Lake in Minnesota. Dust peaked at about 5800, 3800, and 2100 B.C., plus or minus 200 years, according to the counting of annual layers in the lake sediment. During the 2100 B.C. dust pulse, which lasted about a century, the lake received three times more dust each year than it did during the infamous Dust Bowl period in the U.S. in the 1930s. But the archaeological record doesn't reveal how this drought affected early North Americans, who at that time maintained no major population centers.

    In another sign that the Mesopotamian drought was global, Lonnie G. Thompson of Ohio State University and his colleagues found a dust spike preserved in a Peruvian mountain glacier that marks “a major drought” in the Amazon Basin about 2200 B.C., give or take 200 years. It is by far the largest such event of the past 17,000 years. But it doesn't seem to have had entirely negative effects; indeed, it roughly coincides with a shift in population centers from coastal areas of Peru, where the ocean provided subsistence, to higher regions, where agriculture became important. As more such records accumulate in the rapidly accelerating study of recent climate, archaeologists will have a better idea of just how much history can be laid at the feet of climate change.

  8. MATERIALS SCIENCE

    Getting a 3D View of Surfaces

    Gary Taubes

    The true character of a material is often just skin deep. The arrangement of the first two or three layers of atoms can be enough to determine key properties such as friction, hardness, and chemical reactivity. Learning how those atoms are organized has been notoriously difficult, however. Scanning tunneling microscopes look at only the top layer of atoms, while x-rays pass right through to deeper layers. But imaging surfaces is about to get considerably easier and faster.

    For years, researchers solved the atomic structure of a surface by sending in a beam of low-energy electrons, collecting the diffraction pattern generated when the electrons scatter off the surface, and comparing that to model calculations. But without an approximate picture of the surface structure to start with, low-energy electron diffraction, or LEED, was little help. Now, a team of surface physicists from the University of Erlangen-Nürnberg in Erlangen, Germany, has found a way to turn LEED data directly into three-dimensional images of the atomic arrangements.

    Surface impression.

    An electron diffraction pattern (left) from a silicon carbide surface yields the three-dimensional arrangement of atoms, which matches a model (right).

    REUTER ET AL., PRL

    “When we deal with complex materials, and we want to know more about the surface structure, or we want to tailor-make some structures for certain applications, then this will help us do it,” says Ulrich Starke, a member of the Erlangen-Nürnberg team, which describes its work in the 15 December 1997 Physical Review Letters (PRL). “Anything for which a fundamental understanding of surface crystallography is important should benefit from this technology,” adds Dilano Saldin, a solid-state theorist at the University of Wisconsin, Milwaukee.

    The LEED technique dates back to the 1920s, although it wasn't until the late 1960s that researchers developed the algorithms needed to determine the atomic structure of a surface from the data. Since then, LEED has spread to virtually every surface-science laboratory in the world, in part because it is so simple and inexpensive. An electron gun, like a cathode ray tube in your television set, is aimed at a surface; the electrons bounce off the first few atomic layers, interfere with one another like light waves, and form a diffraction pattern on a luminescent screen. From the intensities and positions of the spots in the pattern, an algorithm recreates the arrangement of the surface atoms.

    But it can only do so by iteration, starting with an approximate model of the surface in question. “You give this model to the computer and make the computer calculate the intensities,” says Klaus Heinz, head of the Erlangen-Nürnberg team. “If those calculated intensities coincide with those you measure, you're pretty sure you have the correct structure. But normally, you have to modify your model through trial and error. And in cases of complex structures, it may not ever be possible to find the correct structure.”

    In 1990, however, Saldin and Pedro de Andres of the Universidad Autonoma de Madrid suggested that the diffuse diffraction pattern generated when electrons scatter from a disordered surface might be interpreted as a hologram—a three-dimensional image captured in an interference pattern. Holograms are created when a beam scattered off an object interferes with a reference beam that has bypassed that object. Saldin and de Andres realized that they might be able to generate such a pattern if the surface has a single atom sticking out prominently, which can split the incoming electron beam. Half the beam bounces off the prominent atom and back to the detector, creating the reference beam, explains Heinz. The other half goes on to the other surface atoms and then scatters back to the detector, producing an object beam. The two interfere to produce the holographic image, which can be extracted from the LEED data.
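
    In idealized form, the reconstruction step is just an inverse Fourier transform of the recorded intensity: interference between the reference wave and the object waves encodes each neighbor's position as a fringe frequency. Real LEED holography must correct for strong electron scattering with more elaborate integral kernels; the sketch below, with made-up geometry, captures only the core idea.

    ```python
    # Idealized holographic reconstruction: build a hologram from one
    # reference wave plus two object waves, then invert it by FFT.
    import numpy as np

    n = 256
    k = np.fft.fftfreq(n)              # detector (momentum) coordinates
    KX, KY = np.meshgrid(k, k)

    # reference atom at the origin; two neighbors at known pixel offsets
    neighbors = [(30, 0), (0, 45)]     # (column, row) offsets
    field = 1.0 + sum(0.3 * np.exp(2j * np.pi * (KX * dx + KY * dy))
                      for dx, dy in neighbors)   # reference + object waves
    hologram = np.abs(field) ** 2      # intensity recorded on the screen

    image = np.abs(np.fft.ifft2(hologram))       # reconstruction
    top = np.argsort(image.ravel())[::-1][:5]
    print("brightest pixels (row, col):",
          [np.unravel_index(i, image.shape) for i in top])
    # The first peak is the central reference spot; the others sit at the
    # neighbor offsets and their mirror twins, revealing local geometry.
    ```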

    Heinz's group collaborated with Saldin and his colleagues to produce atomic-resolution holograms of disordered oxygen and potassium atoms on a nickel surface. But measuring the intensity of a diffuse diffraction pattern accurately enough to extract a hologram was extremely difficult; only a few labs in the world could do it. Surface scientists also tend to be much more interested in ordered surfaces than in disordered ones because, says Starke, “those are the ones you can reproducibly prepare.”

    Heinz and his colleagues then realized that, with some modification, the same holographic reconstruction algorithms could work on the much brighter diffraction patterns that result from crystalline surfaces. The result is the pioneering image in the PRL paper: an electron hologram of a well-ordered atomic surface—in this case, silicon carbide.

    Not every surface will give up its secrets to electron diffraction holography. It only works on materials with the occasional prominent atom, although such atoms can be chemically attached to a surface of interest before it is imaged. What's more, the technique can achieve a resolution of only 0.5 angstrom, compared to 0.1 angstrom for conventional LEED. For now, says Saldin, the technique is best suited to providing a quick first guess to plug into the LEED algorithm, which can then generate the correct, well-resolved structure. “It is a direct method of very quickly getting an approximate view of the structure,” he says—a quick, surface impression of a material.

  9. AIDS RESEARCH

    Chemokine Mutation Slows Progression

    Michael Balter

    One of the most insidious features of the AIDS virus, HIV, is its habit of lurking in the body for years before causing overt disease. Why HIV takes so long to destroy the immune system of most infected patients is a central question in AIDS research. Now, findings presented in this issue may provide fresh clues to the mystery—as well as suggest new therapies that could slow or stop progression of the disease.

    On page 389, geneticists Stephen O'Brien, Cheryl Winkler, and their colleagues at the National Cancer Institute in Frederick, Maryland, together with collaborators from other institutions in the United States and Japan, report that HIV-infected patients who have a mutant gene for a molecule called SDF-1 progress much more slowly to full-blown AIDS or death than do people with a normal version of the gene.

    SDF-1 is a member of a class of molecules called chemokines, which help recruit immune cells to sites of inflammation. It normally binds to a receptor molecule on T lymphocytes—the cells targeted by HIV—called CXCR4, which the virus also uses to enter T cells during later stages of the disease. The results suggest that the mutant gene, called SDF1-3'A, helps protect infected people from the ravages of these late-stage viruses. These findings mark the first time that a mutation in a gene coding for a chemokine, rather than a chemokine receptor, has been shown to affect the course of HIV infection.

    Protector?

    SDF-1 chemokine may tie up T cell receptors.

    M. CRUMP/PENCE

    O'Brien's team found the mutation during a genetic screen of blood samples taken from 2419 HIV-infected patients in study cohorts across the United States, as well as 435 people who had been exposed to HIV but remained uninfected. In earlier work on these patients (Science, 27 September 1996, p. 1856), the team found that people with two mutant copies of the gene coding for CCR5—a chemokine receptor targeted by HIV in the earlier stages of infection—are highly resistant to infection. The new study indicates that subjects who are homozygous for the SDF1-3'A mutation—meaning they carry two copies of it—are also protected, but primarily against progression of the disease after infection. Disease progression in homozygous Caucasians, for example, takes three times longer than in similar individuals who possess only one mutant copy or none at all.

    The SDF1-3'A variation, which occurred in homozygous form in less than 5% of the patients studied, does not lie in the part of the gene that is “translated” into the building blocks of SDF-1. Instead, it lies in an adjacent, untranslated portion whose sequence is conserved among mammalian species, indicating that it may have an important regulatory function. O'Brien and his colleagues suggest that this segment may control the production or transport of the chemokine. If so, the mutation may protect infected individuals by increasing the production or availability of SDF-1, which would bind to CXCR4 and block the virus from entering the T cells.

    The study provides no direct evidence for this idea, but other researchers told Science that it is a reasonable—and attractive—interpretation of the results. Viral immunologist Jean-Louis Virelizier of the Pasteur Institute in Paris says that an increased level of the chemokine is the “simplest explanation” for the findings. Although SDF-1 has previously been shown to block CXCR4-using viruses in the test tube, Virelizier says the new results “provide the first evidence” that the chemokine “may participate in vivo in the host's defense against HIV infection.” And Dan Littman, an AIDS researcher at the New York University Medical Center, says that if the mutation does affect SDF-1 levels, “it would be very exciting indeed, because it would suggest that progression to disease is in large part dependent on HIV interacting with CXCR4.” That is a popular, but not yet proven, hypothesis for how HIV becomes increasingly lethal to its target cells.

    The next step, researchers say, will be to prove that the mutation really does increase SDF-1 levels. If O'Brien's hypothesis is right, the findings could point the way to development of new anti-HIV drugs: SDF-1 or a laboratory-modified version of the molecule “may have antiviral effects even at late stages of HIV infection,” says Virelizier. Indeed, O'Brien argues that the protective effects of the SDF1-3'A mutation could be considered a first clinical test of that appealing possibility. “This was not done in the test tube, but with patients infected with HIV,” he says. “It has parallels with a clinical trial. The results are very provocative.”
