News this Week

Science  29 Oct 1999:
Vol. 286, Issue 5441, p. 876

    Siberian Mammoth Find Raises Hopes, Questions

    1. Richard Stone

    In the first-ever operation of its kind, a team working in Siberia has excavated a huge chunk of permanently frozen sediment containing what it hopes are the remains of a woolly mammoth that died 20,000 years ago. On 17 October the crew airlifted the 22-ton block of tundra to a cavern hewed from the ice on Russia's Taimyr Peninsula, where scientists plan to thaw the block next spring to study what's left of the extinct beast inside—and perhaps even mount an effort to clone it. Expectations are high, but one Russian expert involved in the made-for-TV expedition is pessimistic: He contends that the find's significance has been exaggerated, and the team may end up with little more than 22 tons of dirt and meltwater.

    If a carcass is indeed hidden in the ice, it won't be the first time the woolly mammoth, which made its last stand on an island in the Arctic Ocean about 3800 years ago, has seen the light of modern day. Several intact carcasses have been unearthed in the past 2 centuries—including a now-famous baby mammoth, nicknamed Dima, found in 1977—but some were already dried out and others quickly spoiled upon exposure to above-freezing temperatures. The latest find, claims paleontologist Larry Agenbroad of Northern Arizona University in Flagstaff, “is a first to bring an entire frozen organism out of the tundra complete.”

    The story began 2 years ago, when a family of nomadic reindeer herders dug up two huge mammoth tusks, each more than 2 meters long, protruding from the permafrost. The tale reached the ears of Bernard Buigues, an explorer with a Paris-based tourist firm called Polar Circles Expeditions, which uses the Taimyr capital, Khatanga, as a base for air and ski adventures to the North Pole. The native Dolgans led Buigues to the site near the Bolshaya Balakhnya River last year, where he recovered portions of the fractured mammoth skull, along with pieces of hair, skin, and bone later used by researchers at the University of Utrecht in the Netherlands to date the remains. Buigues then organized a $2 million expedition—funded largely by the Discovery Channel, which is preparing a documentary on the Siberian adventure to air in March—to excavate the presumed carcass, called the Jarkov mammoth after the family that found the tusks.

    Starting work last month in sometimes bitterly cold, windy conditions, Dolgan workers used jackhammers to carve around and under the 3-by-2-meter permafrost block thought to contain the mammoth, then slid an iron support beneath it. They prepared to haul it out of the ground by helicopter. At first the permafrost was going nowhere. “It seemed impossible to lift,” Buigues told reporters last week. At last the block, with the two original tusks stuck into it to improve the TV images, rose up into a clear blue sky. It was then flown more than 300 kilometers to Khatanga for storage in an ice cave with a constant temperature of about 15 degrees Celsius below zero.

    Starting next spring, scientists plan to use blow dryers to thaw the block and find out how much of the mammoth—known from its tusks to have been a male and from its teeth to have been about 47 years old—might be inside. “We have not yet seen any of the flesh or organs or so on,” says Agenbroad, but ground-penetrating radar readings by Buigues indicate that substantial portions of the mammoth may remain. Not covered by glaciers during the last Ice Age, the Bolshaya Balakhnya region is a well-known graveyard for mammoths, woolly rhinos, and other extinct creatures, says Andrei Sher of the Institute of Ecology and Evolution in Moscow. If the radar does detect remains, says Sher, it will be a “powerful new tool that we could only dream about before.”

    But Alexei Tikhonov, a mammoth expert at the Institute of Zoology in St. Petersburg, Russia, says he doubts the block contains much of the beast. Tikhonov, who spent a week at the site last month, says only the top 80 centimeters of permafrost in the excavated block consists of sandy earth that's likely to preserve tissues; the remainder is an ice wedge that, he argues—based on 15 years of fieldwork on the tundra—is unlikely to contain remains, as it is a crack that widened slowly as it filled with snow and water. Moreover, the few bones found at the site are “very clean,” with no signs of attached muscle. “All of this suggests that the find is not very important, not interesting for science,” Tikhonov told Science. However, Tikhonov was not on site when the team worked around the clock for 3 days to thaw the top few centimeters of the block for the sake of the TV crew, says Dick Mol, a highly respected amateur Dutch paleontologist. “I was against this idea,” says Mol, but the dryers did reveal lush gray and yellow hair that appeared to be rooted in flesh. “For me this is evidence there must be much more.”

    If Tikhonov is proved wrong, a nonprofit organization called Mammuthus, founded by Buigues, will oversee scientific studies of the remains in a cold lab open to researchers. One tantalizing long shot would be well-preserved sperm or other cells containing viable DNA that would allow scientists to attempt to resurrect the species. They could inject sperm into the egg of a close relative—the Asian elephant—to try to create a hybrid, or perhaps even attempt to clone a pure mammoth by fusing the nucleus of a mammoth cell with an elephant egg cell stripped of its DNA. According to Agenbroad, a reputable U.S. lab experienced in elephant breeding has already expressed interest in teaming up with Mammuthus. They aren't the only ones interested in resurrecting a mammoth: A Japanese team hopes to pull off a similar feat with mammoth remains it unearthed this summer in another region of Siberia.

    Even if the Jarkov mammoth yields subpar tissue, scientists expect to learn more about the mammoth's habitat from plant material found underneath the permafrost block. And by keeping the remains at subzero temperatures for as long as possible, scientists hope to extract pathogens that could have contributed to its death. “People will speak of this discovery 100 years in the future,” predicts Mol. Provided, that is, something emerges from the ice when the dryers are turned off.


    The Why Behind the Y Chromosome

    1. Gretchen Vogel

    The human Y chromosome may be best known as a champion testosterone booster, but its functional powers are puny compared to those of its partner, the X chromosome: It is only one-third the size of the X and has only 1/100th as many genes. Despite this mismatch, scientists have long suspected that the X and Y were once equals that gradually diverged over time. Now, on page 964, two researchers report evidence for how this split occurred.

    In a kind of molecular-scale fossil dig, geneticists David Page of the Whitehead Institute for Biomedical Research at the Massachusetts Institute of Technology and Bruce Lahn, now at the University of Chicago, analyzed genetic “fragments of history”—genes still found on both chromosomes that have remained relatively unchanged for millennia. They used these genetic relics to piece together a rough history of how the chromosomes drifted apart. The distinctions between X and Y didn't happen gradually, they concluded, but in a stepwise fashion, implying that at least four distinct events—most likely rearrangements of the Y chromosome—drove the chromosomes apart over hundreds of millions of years.

    “It's fascinating work,” says geneticist Huntington Willard of Case Western Reserve University School of Medicine in Cleveland. “It gives us an intriguing glimpse” into the evolution of the human sex chromosomes. The work may also shed light on the evolution of the sex chromosomes of birds and insects, which developed independently of the mammalian system.

    The mismatch between the X and Y chromosomes creates some unusual biology. During the specialized cell division that creates sperm and eggs, most chromosome pairs are able to line up and swap pieces, a process called recombination. Like two friends who keep in touch despite being separated by long distances, this occasional exchange keeps the pairs up to date with each other. It also creates beneficial combinations of genes that can spread throughout the population. But recombination won't work if the pairs are a poor match, and in humans, the X and Y chromosomes recombine only at their tips.

    Although X and Y look very different, in recent years geneticists have turned up at least 19 genes that are present on both—all of them leftovers from the days when the chromosomes were kept similar by recombination. Lahn and Page scored each gene pair for sequence similarity, focusing on the number of “synonymous” gene differences between them—changes in DNA that don't change the protein's amino acid sequence. These mutations presumably are subject to little selective pressure and accumulate randomly. Thus, as more time elapses, more mutations should accrue. If so, the number of actual synonymous gene changes should offer a rough estimate of the length of time that the genes have been evolving independently, Page explains.
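    The clock logic Page describes can be sketched in a few lines. Everything numeric here is hypothetical and for illustration only: the substitution rate is an assumed value, not a figure from the study.

```python
# Molecular-clock sketch: if synonymous changes accumulate at a roughly
# constant rate, the synonymous divergence between an X gene and its Y
# partner dates when the pair stopped recombining.
# NOTE: the default rate below is an assumed, illustrative value.

def divergence_time_mya(ks, rate_per_site_per_myr=2e-3):
    """Rough time (millions of years) since two gene copies diverged.

    ks   -- synonymous substitutions per synonymous site between the copies
    rate -- assumed substitution rate per site per million years, per lineage
    """
    # Changes accrue independently on both lineages, hence the factor of 2.
    return ks / (2 * rate_per_site_per_myr)

# A gene pair showing 20% synonymous divergence, under the assumed rate:
print(f"{divergence_time_mya(0.20):.0f} million years")
```

    Any real estimate would also need confidence intervals and a calibrated rate; the point is only that more synonymous differences imply a longer separation.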

    When the researchers looked at this value for different parts of the chromosomes, they were “stunned,” says Page. He wasn't expecting a clear pattern, but in fact the values for genes on the X chromosome grouped into four “strata” neatly arrayed along the chromosome's length. The genes on the chromosome's long arm were most different from their Y counterparts, and as the scientists moved toward the opposite end of the chromosome, the genes became more and more similar to their Y doubles.

    To explain this pattern, Lahn and Page propose that the Y chromosome was reshuffled four times, perhaps through a process called inversion, in which a piece of chromosome breaks off, flips over, and reattaches so the order of the genes in that stretch is inverted. Each inversion prevented a stretch of the Y from aligning and exchanging pieces with the matching piece on the X. After all four inversions, the X and Y can now recombine only at their tips.

    To get a rough estimate of when these inversions occurred, the scientists used divergence times that are known from fossils and genetic evidence. For example, two gene pairs in the fourth “stratum” of the X chromosome are still able to recombine in prosimians but have diverged in both Old and New World monkeys, so Lahn and Page estimated that the most recent reshuffling happened between 30 million and 50 million years ago, after monkeys diverged from prosimians but before New and Old World monkeys split. In a similar way, they estimate that the third inversion happened between 80 million and 130 million years ago and the second between 130 million and 170 million years ago. But because only a few genes remain similar in the oldest “layer,” estimating the age of the first rearrangement was tougher. So the scientists used the ages of the three youngest strata as a rudimentary clock and concluded that the oldest section of the chromosomes diverged between 240 million and 320 million years ago—shortly after birds and mammals are thought to have split from their common, reptile-like ancestor.
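    The "rudimentary clock" used for the oldest stratum amounts to fitting divergence against age for the dated strata and projecting the fit outward. This sketch uses invented divergence values and assumed midpoint ages purely to show the mechanics; none of these numbers come from the paper.

```python
# Hypothetical extrapolation sketch: fit synonymous divergence vs. age
# for the three dated strata, then project to the oldest stratum, whose
# age cannot be bracketed directly by fossil calibrations.
# All numbers are invented for illustration.

ages_myr = [40.0, 105.0, 150.0]   # assumed midpoint ages of strata 4, 3, 2
divergence = [0.16, 0.42, 0.60]   # invented per-stratum divergence values

# Least-squares slope for a line through the origin: divergence = slope * age
slope = sum(a * k for a, k in zip(ages_myr, divergence)) / \
        sum(a * a for a in ages_myr)

oldest_ks = 1.12                  # invented divergence for stratum 1
print(f"stratum 1 age ~ {oldest_ks / slope:.0f} million years")
```

    With these made-up inputs the projection lands near the middle of the 240-to-320-million-year window quoted above, but that agreement is by construction, not evidence.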

    Such a scenario fits with the biology of animals today: Many reptiles lack specific sex chromosomes (depending instead on temperature differences during development to modulate individual sex-determining genes), and presumably the reptilian ancestor of birds and mammals lacked sex chromosomes, too. The avian sex chromosomes, W and Z, seem to be derived from the chromosome pair that is today number nine in humans.

    Indeed, “everything seems to fit together,” says evolutionary biologist Brian Charlesworth of the University of Edinburgh. The result is “really pleasing,” agrees evolutionary biologist James Bull of the University of Texas, Austin, and offers a surprisingly clear evolutionary record. Says Bull: “This is the study that's going to go into the textbooks.”


    Ukrainian KGB Puts Heat on Researchers

    1. Richard Stone

    In an episode that is rekindling memories of Soviet-era repression, Ukrainian security agents last week accused three marine scientists of crimes against the state: shipping sensitive data out of the country and illegally accepting Western currency for research. The unprecedented post-Cold War investigation has stirred an international effort to persuade the Ukrainian government to rein in its version of the KGB before formal charges are brought. Prosecuting the researchers, observers in the West say, could put scientific collaboration with Ukraine into a deep chill.

    Like many talented scientists who have chosen to stay put in the former Soviet Union, Sergey Piontkovski and his team at the Institute of Biology of the Southern Seas (IBSS) in Sevastopol, Ukraine, have supplemented their meager state salaries with grants from Western organizations. Piontkovski has been more successful than most, pulling in grants in recent months from the U.K. government's Darwin Initiative; a European Union agency called INTAS that supports former Soviet scientists; and the U.S. Office of Naval Research (ONR). According to several Ukrainian scientists, jealous co-workers at the institute may be trying to take Piontkovski down: “As far as I know, these people wrote a letter to the local KGB,” says Alexei Mishonov of the Marine Hydrophysical Institute (MHI) in Sevastopol, now a visiting scientist at Texas A&M University in College Station.

    Whatever aroused their interest, on 16 October Ukrainian security bureau (SBU) agents raided the homes and offices of Piontkovski; his former wife, Galina Piontkovskaya, who is also an IBSS scientist; and IBSS deputy director Yuri Tokarev. They seized the researchers' scientific papers, computers, money, and passports. “They confiscated everything,” says Piontkovski, who, when contacted at his home by Science, claimed that the SBU was monitoring his telephone calls. If convicted of illegal funds transfers, he says, all three scientists could face steep fines and up to 8 years in prison. Tokarev, Piontkovski says, was also accused of passing Soviet-era data to the West. The SBU investigation has since expanded to target MHI scientists also funded by the three grants, says Mishonov.

    Work under the grants involves analyzing and digitizing a wealth of data on plankton bioluminescence collected by over 50 Soviet ocean expeditions from 1970 to 1990, as well as voyages undertaken by Ukraine and Russia after the Soviet Union dissolved. The grants call for making the information, a measure of the ocean's total biomass, available to the scientific community on CD-ROM. “I can hardly see how this kind of plankton studies can represent a risk to the national security of any nation,” says marine biologist Luis D'Croz of the Smithsonian Tropical Research Institute in Panama. “This is simply absurd.” The data “were not classified in any way,” adds marine biologist Robert Williams of the Plymouth Marine Laboratory in the United Kingdom, a co-principal investigator on the ONR and Darwin grants, although he points out that the National Academy of Sciences of Ukraine prohibited the release of Soviet acoustic data that might give insights into submarine movements.

    Ukraine's Byzantine currency laws make it hard for Western officials to evaluate the allegations of illegal funds transfers. “We have told the Ukrainian government time and again that they are creating a hostile environment for investment,” says Gerson Sher, director of the U.S. Civilian Research and Development Foundation. Sher estimates that three times as many tax-free dollars for science would flow into Ukraine if the legal situation were clarified. “It's like the IRS [Internal Revenue Service] in our own country—if they want to get you they'll find a way,” he says. Mishonov agrees: “It's very easy to find a currency law that's broken.” INTAS director David Gould, however, says his agency has abided by the law in funding scientists under a program sanctioned by the Ukrainian government.

    To send a signal that the SBU's own steps are being monitored, the European Union's representative in Kiev has taken up the matter with Ukraine's Ministry of Foreign Affairs, while Piontkovski's colleagues at IBSS and at foreign institutions have appealed to Boris Paton, the powerful academy president, to bring his influence to bear. If the SBU is preparing a broader campaign against Western-funded Ukrainian scientists, warns Williams, he and others who wish to sustain their colleagues may have to keep them at arm's length, for “fear of placing them in jeopardy.”


    Making Social Science Data More Useful

    1. Wayne Kondro*
    1. Wayne Kondro writes from Ottawa.

    Ottawa—How has the North American Free Trade Agreement (NAFTA) affected the economies and labor markets of Canada, the United States, and Mexico? Historian Steven Ruggles of the University of Minnesota, Minneapolis, thought he'd find the answer by examining census data, which include such factors as nature and location of occupation. But such an analysis requires a common database. And that, Ruggles discovered, doesn't exist.

    Not yet, anyway. Last month, Ruggles received a $3.5 million grant from the National Science Foundation (NSF) to help develop an international database of uniformly coded historical censuses from as many as 20 countries. The database could shed light not just on the economic fallout from NAFTA but also on trends in the labor market and related phenomena, including immigration patterns. Tracking the movement of Norwegians to the Americas earlier this century, for example, could illuminate how immigration affects family structure. “How much change [in families] is the product of immigration, and how much of it is just changes that are going on anyway?” Ruggles asks.

    Ruggles's grant is one of six awarded through a $23 million program, “Enhancing Infrastructure for the Social and Behavioral Sciences,” created by Bennett Bertenthal, outgoing NSF assistant director for the Social, Behavioral, and Economic Sciences (see table). Bertenthal was also co-chair of a workshop held here earlier this month sponsored by the 29-nation Organization for Economic Cooperation and Development (OECD). It was the first of four international workshops aimed at “reinventing” the social sciences to make them more empirical and relevant to policy-makers.


    The NSF initiative addresses the serious underfunding of databases, high-speed computers, and networking technologies that has left the social and behavioral sciences far behind the natural and life sciences. The OECD workshops are designed as a forum for applying these and other results to public concerns, from childhood poverty and illiteracy to an aging workforce. “With the scaling up of sciences comes an increase in expectations,” says Bertenthal. “We need to be prepared to provide more comprehensive answers to the questions that are important at the beginning of the 21st century.”

    But, although the future may lie in international comparisons of national social conditions, policies, and programs, the 60 senior researchers and administrators gathered here quickly ran up against the hard realities of a world in which countries hoard their microdata like gold, and statistical agencies often refuse access to researchers on the grounds of preserving confidentiality. There's also a dearth of trained researchers capable of crunching the numbers. And sometimes, as with census information, the data are simply incompatible.

    “Unless the information has been categorized in the same way in different countries, you don't know if you're comparing apples to oranges or apples to apples,” says University of Ottawa historian Lisa Dillon, chair of the International Microdata Access Group (IMAG), which is working with Ruggles on the historical censuses database.

    Gaston Schaber, president of the Luxembourg research center CEPS/INSTEAD, has firsthand experience with that problem. He has spent the last 17 years trying to harmonize data from national household surveys throughout Europe. The creation of standardized international data sets, he argues, would allow governments essentially to test proposed social policies before implementing them. And help is on the way: The European Commission has just approved $1 million to develop a consortium that will create an international database containing longitudinal household surveys from 16 European countries, the United States, and Canada.

    A more general solution, says Research Council of Norway science adviser Trygve Lande, would be a manual to standardize national data-collection methods and definitions. However, a few delegates worried that standardization might intrude on their freedom to design their own surveys, while others said that getting the OECD's elaborate and sluggish bureaucracy to approve such a manual would simply take too long. One way around that problem, says forum co-chair Marc Renaud, president of the Social Sciences and Humanities Research Council of Canada, is to create a high-level task force that would report directly to OECD president Donald Johnston. That approach and other topics will be discussed next spring at a workshop in Bruges, Belgium, on “social sciences evidence-based policies.” A third workshop next fall in Tokyo will examine how the social sciences can spur social innovations.

    Meanwhile, Ruggles and IMAG plan to keep looking for common denominators among the census data that they can apply without sacrificing important details. “Except for age and sex,” says Ruggles, “none of the variables that look the same are actually identical.”

  JAPAN

    Millennium Projects May Provide Science Bonanza

    1. Dennis Normile

    Tokyo—Japan's scientists are cautiously applauding the government's choice last week of the types of high-profile science and technology projects to be funded next year as part of a special $5 billion economic restructuring initiative. The so-called Millennium Projects, initiated by Prime Minister Keizo Obuchi, could add as much as $2.4 billion to next year's science budget in information technology, genetics, and environmental studies.

    “We're very happy,” says Ryu Ohsugi, director of the Genome Research Office of the Ministry of Agriculture, Forestry, and Fisheries, which hopes to receive money to speed up work on sequencing and analyzing the rice genome. “The Millennium Projects are emphasizing areas where additional spending is needed,” adds Leo Esaki, a Nobel Prize-winning physicist and former president of the University of Tsukuba. But others are more cautious in their praise. “In some cases, it's hard to tell just what's included,” says an official from the New Research Centers Planning Office of RIKEN (the Institute of Physical and Chemical Research) outside Tokyo, which has proposed five Millennium Projects.

    Obuchi caught the scientific community off guard in August when he asked each agency to nominate research-oriented projects to address social and economic needs in three broadly defined categories: information technology, the aging of society, and the environment. The projects, ideally involving cooperation among government, industry, and academia, are intended to boost the nation's technological prowess and address pressing societal needs. Priority areas for projects of up to 5 years include such goals as connecting all primary and secondary schools to the Internet, creating a paperless government by 2003, enhancing the skills of older workers, reducing the use of dioxin and PCBs, and developing fuel cell-powered cars.

    Although some scientists worried initially that the awards would circumvent established selection procedures, most agencies ended up nominating projects from among those already in the pipeline in the hope of moving them forward or freeing up funds for other projects. Officials also coordinated their proposals to ensure that everyone would get a piece of the pie. “I think we have almost gotten what we wanted,” says Nobuhiro Muroya, deputy director of the Science and Technology Agency's Planning and Evaluation Division. The three categories also proved remarkably flexible, with rice genome work fitting within the “needs of an aging society.”

    In addition, the Millennium Projects will pad Japan's R&D bottom line. Without them, projected science spending would remain relatively flat, at $30 billion for the fiscal year beginning on 1 April. But crediting the entire $2.4 billion to science would mean an annual increase of more than 8%.

    The uncertainties surrounding the projects pose some problems for planners. At RIKEN, for example, it's not clear how much money will go to two projects on the list: a new institute to focus on cell development, differentiation, and regeneration; and a project to study single nucleotide polymorphisms, the subtle genetic variations between individuals that may lead to drugs tailored to a person's characteristics (Science, 9 July, p. 183). Officials also hope to get some money for a new research center to focus on plant genetics and genomics, including 50 new positions.

    Another uncertainty stems from the fact that Obuchi has invited the public to submit ideas before the list of projects is finalized in December. Just how the public will participate is up in the air, however. Muroya says a telephone hot line is a possibility. Any suggestions will be reviewed by the Council for Science and Technology, the nation's highest science advisory body, which is chaired by the prime minister.


    Leptin Not Impressive in Clinical Trial

    1. Trisha Gura*
    1. Trisha Gura is a writer in Cleveland.

    Like a promising starlet with her first box-office flop, the hormone leptin, which made a stunning debut 5 years ago as a potential weight-loss drug, has met with disappointment after the conclusion of its first clinical trial in humans. On a positive note, the results, which appear in the 27 October issue of The Journal of the American Medical Association, show that some study participants given leptin lost more weight than controls. The differences were statistically significant, however, only in obese subjects given the two highest leptin doses.

    And because it's a protein, the hormone had to be injected, which produced redness and swelling severe enough to cause subjects to drop out of the study early. What's more, some obese volunteers cheated on their required diets. As a result, they gained weight despite receiving the drug. “I was not very impressed,” says Jeffrey Flier, a longtime leptin researcher at Harvard Medical School in Boston. “Leptin, given the way the researchers gave it to a group of people with a common variety of obesity, is relatively ineffective in most of them.”

    There still may be hope for leptin as a diet therapy, however, if researchers can uncover what made the few individuals who responded to the drug more sensitive, which could enable them to identify patients likely to benefit. But that sort of finding, even if it comes soon, will still leave the majority of obese individuals—an estimated one-third of the U.S. population—to struggle with therapies that don't really seem to work.

    The story of leptin has had as many ups and downs as a chronic dieter's bathroom scale. First discovered in 1994 by a team led by Jeffrey Friedman at The Rockefeller University in New York City, the appetite-suppressing hormone hit the headlines accompanied by pictures of mice that were grossly overweight as a result of leptin gene mutations. Rockefeller University subsequently licensed the human version of leptin to the biotech firm Amgen Inc. of Thousand Oaks, California, for an initial fee of $20 million, and researchers scrambled buoyantly toward the hope of using the hormone as a wonder diet drug.

    One sign of trouble came, however, when researchers learned that most obese human patients do not have depleted leptin levels in their bloodstreams. In fact, the heavier the person, the higher the levels. Still, that did not rule out the idea that leptin might work as an obesity therapy. Patients with adult-onset diabetes make insulin, but develop a resistance to the hormone that can sometimes be overcome by insulin injections. Hoping for something similar with leptin, a team led by Mark McCamish at Amgen devised a clinical trial designed to quickly test its safety and efficacy in humans.

    Investigators at six sites recruited 54 normal and 73 obese volunteers, who were assigned to receive injections either of a placebo or of leptin at one of four different doses. Obese volunteers were also instructed to diet. After 4 weeks, when the drug appeared to be safe, the obese volunteers were given the choice to go the full 24 weeks planned for the study. Forty-seven completed the study, the rest having dropped out either because of problems with injection sites or noncompliance with the diet or the injections.

    At the end of the trial, individuals in the highest dose group had lost a mean weight of 7.1 kilograms as compared to 1.3 kilograms lost by the controls, who received the placebo rather than the drug. “Our data show that there is not an absolute leptin resistance in this group of [obese] people,” says study co-author Steven Heymsfield, a nutritionist at St. Luke's-Roosevelt Hospital Center in New York City.

    That may be true, say other researchers, including Flier, but a close look at the numbers shows that only patients in the highest two dose groups had significant weight losses compared to those taking the placebo. And even then, several in those groups actually gained weight during the study. “It's looking very much as if there is no real effect except in a subset of patients,” Flier says.

    The Amgen team now plans to try to identify exactly what factors made those people more responsive to leptin. If successful, says Heymsfield, “one could screen people beforehand for certain markers and then know the probability of responding would be much higher.” He notes that researchers at Addenbrooke's Hospital in Cambridge, U.K., reported in the 16 September issue of The New England Journal of Medicine that leptin did cure the obesity of a youngster found to have defects in his leptin gene. Such mutations are rare, however, and until researchers learn to identify people with more common types of obesity that respond to leptin, the drug is not likely to appear in pharmacies.


    Tweaking the Clock of Radioactive Decay

    1. Richard A. Kerr

    Certainty, it seems, is on the wane. The sun may rise tomorrow on schedule, and the seasons may pass as they always have. But radioactive decay—the pacemaker of geologic time—can no longer be called precisely “clocklike.” Says geochemist Douglas Hammond of the University of Southern California (USC) in Los Angeles: “Everybody always assumes radioactive decay to be totally independent of temperature, pressure, and chemical form. It seems there are some exceptions.”

    In the 15 September issue of Earth and Planetary Science Letters, geochemist Chih-An Huh of the Institute of Earth Sciences of the Academia Sinica in Taipei reports that the decay rate of beryllium-7 varies, depending on its chemical form. Creationists hoping to trim geologic history to biblical proportions will be disappointed—the variations seen so far are much too small, just a percent or so, to affect Earth's overall time scale. Still, the variability in beryllium decay will prompt those who want to trace out fine divisions in the earliest reaches of time to take a close look at their pacemakers.

    Theoreticians long ago anticipated some variability of radioactive decay. The decay of beryllium-7, for example, should depend on the density of electrons at the nucleus. That's because it transforms itself into lithium-7 by capturing one of its own electrons, turning one of its protons into a neutron, and emitting a gamma ray. When a change in chemical bonding subtly rearranges the electrons and increases an electron's chance of finding itself at the nucleus, the odds are better that it will be captured and the beryllium will decay.

    In the last few years, German researchers have demonstrated the converse of this effect: a surge in the decay of rhenium-187, which emits an electron rather than capturing one. When Fritz Bosch and his colleagues at the Gesellschaft für Schwerionenforschung in Darmstadt, Germany, stripped away all the electrons from rhenium nuclei, something that might happen in a star's harsh interior, its half-life plummeted from 42 billion years to 33 years. But, until now, researchers have detected only tiny variations (or none at all) in the decay rate of beryllium and other atoms under Earth-like conditions.

    Undismayed, Huh applied the latest technology to the problem. He used an extremely sensitive but stable gamma ray spectrometer to monitor the decay of beryllium-7 (which has a half-life of about 53.3 days) in the form of the hydrated ion, the hydroxide, and the oxide—chemical combinations common in the environment. Thanks to an unprecedented precision of ±0.01%, he could see that the half-lives of the three forms were 53.69 days, 53.42 days, and 54.23 days, respectively. The 1.5% range is “probably quite real,” says geochemist Teh-Lung Ku of USC. “Although the idea has been around quite a while, this time [the researchers] will be able to show it more convincingly.”
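The reported numbers can be checked directly; a quick sketch using only the half-lives quoted above:

```python
import math

# Half-lives (days) reported for the three chemical forms of beryllium-7
half_lives = {"hydrated ion": 53.69, "hydroxide": 53.42, "oxide": 54.23}

# Corresponding decay constants, lambda = ln(2) / T_half
decay_constants = {form: math.log(2) / t for form, t in half_lives.items()}

# Fractional spread between the extreme forms
lo, hi = min(half_lives.values()), max(half_lives.values())
spread = (hi - lo) / lo
print(f"spread = {spread:.1%}")  # about 1.5%, matching the figure in the text
```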

    It remains to be seen how important the effect will be in dating geologic samples. Beryllium-7 is used to gauge the rate of erosion or sediment deposition over weeks to months. Except perhaps in studies aspiring to the highest possible resolution, the decay variability due to chemical form is likely to be swamped by other uncertainties, says Hammond.

    If similar effects turn up in other radioactive clocks that tick over hundreds of millions or even billions of years, however, they would loom large to geochronologists trying to work out the order of closely spaced geologic events in the distant past. Such fine distinctions matter, for example, to researchers who are using the decay of potassium-40 (half-life of 1.25 billion years) to sort out the mass extinction of 250 million years ago (Science, 15 May 1998, p. 1007). But, although potassium-40, like beryllium-7, decays by electron capture, its innermost electrons—the ones most likely to be snagged—are more strongly shielded from external effects. The potassium ion has two complete shells of electrons protecting its two innermost electrons, whereas the beryllium ion has none. Thus, researchers expect the effect of chemical form on potassium-40 to be far less than on beryllium-7.

    But that won't stop Huh from trying to check the constancy of this clock. Even now, he is counting decay rates of rubidium-83. It has an electronic structure that provides even more shielding than does potassium-40, but its 86-day half-life will make experiments reasonably quick to perform. In a few months, he'll know if ancient days are even a tiny bit closer than we thought.


    Shalala Takes Watchdog Office Out of the Hunt

    1. Jocelyn Kaiser

    The Department of Health and Human Services (HHS) has decided to downgrade the role of its Office of Research Integrity (ORI) in policing scientific misconduct. The change, in line with a new government-wide policy, strips ORI of the power to conduct investigations and, instead, asks it to teach universities how to prevent misconduct. It will continue to review the results of university investigations and propose sanctions. “ORI now goes into the oversight/recommendation role,” says Chris Pascal, acting director of the office, which became notorious a decade ago for its dogged pursuit of allegations against a colleague of Nobelist David Baltimore.

    Created in 1989 and assigned its present status in 1992, ORI has had responsibility for both investigating misconduct by HHS-funded researchers and imposing sanctions. But the agency's effectiveness was weakened by several instances in which charges against individuals were later abandoned or findings overturned on appeal. The decision by HHS Secretary Donna Shalala essentially adopts a 4-year-old recommendation by a congressionally appointed commission headed by Harvard reproductive biologist Kenneth Ryan (Science, 1 December 1995, p. 1431).

    The new plan, formulated by an internal review panel headed by Assistant Secretary for Health (ASH) David Satcher, makes HHS dependent primarily on an institution's own investigation. ORI will review the findings and, if necessary, draw up sanctions, which it will send to Satcher as recommendations. “The ASH has no role right now [in that process],” says Pascal. Any additional investigation will be conducted by HHS's inspector-general (IG). Appeals will continue to be heard by a separate HHS panel of experts.

    The eight scientist-investigators in ORI's investigative unit will concentrate on oversight and on-site technical assistance, Pascal says. HHS may also provide more direct support: It will soon launch a pilot project to assist institutions unable or unwilling to do their own investigations by offering them help from a consortium of experienced universities. The new scheme is consistent with the approach taken by the National Science Foundation, where the IG handles misconduct investigations and forwards its recommendations to the deputy NSF director. Pascal says ORI has been relying on universities to do most investigations since 1995, when it began to limit the number of cases it pursues. Barbara Mishkin, an attorney at the Washington, D.C., law firm of Hogan & Hartson, who has specialized in misconduct cases, says that ORI has improved its reputation in recent years by training investigators and “being much more selective” about choosing cases.

    In announcing the changes at ORI, Shalala also said the department will adopt a newly proposed federal research misconduct definition that would limit misconduct to fabrication, falsification, and plagiarism (Science, 15 October, p. 391). University of California, Berkeley, biochemist Howard Schachman, speaking for the Federation of American Societies for Experimental Biology, says the new procedures recognize that universities have learned a lot about handling misconduct cases in the past 10 years. “I'm ecstatic about how this has come out.”


    Cleared of Misconduct, Geoscientist Sues Critics

    1. David Malakoff

    Ronald Dorn, a prominent geoscientist at Arizona State University (ASU) in Tempe, has filed suit against the authors of an article, published last year in Science, who raised doubts about some of his work. Dorn is charging that statements made in the article, along with other comments by some of the authors, implied that he had doctored rock samples used to date ancient stone carvings. Earlier this month, two investigations concluded that Dorn did not commit scientific misconduct, and last week Dorn finished officially informing the eight scientists that he is suing them for defamation. Both sides are staying mum about the suit, but some observers worry that the litigation could deter potential whistleblowers and chill public discussion of scientific controversies.

    The suit is based on a 4-year-old controversy that revolves around a dating technique that Dorn developed in the mid-1980s but abandoned as flawed in 1996. To date stone carvings and geological features such as old shorelines, Dorn used acid to extract microscopic quantities of organic material, including plant remains, from beneath a thin layer of natural varnish on rock surfaces. He then sent the material to an accelerator mass spectrometry (AMS) laboratory to measure the amount of radioactive carbon-14, which decays at a known rate, that was present in the samples. The technique became controversial after it yielded ages for some stone artifacts from the southwestern United States that were several thousand years older than those accepted by many archaeologists.
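The dating step can be made concrete. As a hedged sketch using the conventional Libby mean life of 8,033 years (an actual AMS lab's calibration pipeline is more involved), it also shows why grains of vastly different ages make the dates ambiguous:

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years; the conventional value used in radiocarbon dating

def radiocarbon_age(c14_fraction: float) -> float:
    """Conventional radiocarbon age from the fraction of carbon-14
    remaining relative to the modern reference."""
    return -LIBBY_MEAN_LIFE * math.log(c14_fraction)

# A sample retaining half its carbon-14 dates to the conventional
# (Libby) half-life:
print(round(radiocarbon_age(0.5)))  # 5568

# Why mixed-age grains are a problem: equal parts of 5,000- and
# 20,000-year-old carbon yield one apparent age matching neither component.
mix = 0.5 * math.exp(-5_000 / LIBBY_MEAN_LIFE) + 0.5 * math.exp(-20_000 / LIBBY_MEAN_LIFE)
print(round(radiocarbon_age(mix)))  # roughly 9,400 years
```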

    In 1996, geoscientist Warren Beck of the AMS laboratory at the University of Arizona, Tucson, discovered that a sample of Dorn's that he was processing contained coal and charcoal grains of vastly different ages. Those variations, Beck, geochemist Wallace Broecker of Columbia University's Lamont-Doherty Earth Observatory in Palisades, New York, and other researchers later wrote, made the dates obtained by the technique “ambiguous”; they also noted that they were unable to find the grains in samples that were not prepared in Dorn's lab. Their paper was eventually published in Science as a Technical Comment, accompanied by a response from Dorn (Science, 26 June 1998, p. 2132).

    Although the authors did not accuse Dorn of misconduct, several shared their findings with officials at ASU and the National Science Foundation (NSF), which had funded some of Dorn's work. Both organizations began investigations of whether Dorn had manipulated the ages of his samples by adding the carbon grains.

    This month ASU and NSF cleared Dorn. A faculty panel established by ASU concluded that “the evidence did not support allegations that Dorn added coal or charcoal to rock varnish samples” and that studies showed the materials occurred together naturally. In June, even before the finding was released, however, Dorn moved to file suit against the authors of the Science paper, charging that their statements were “published with an ‘evil heart.’” His amended complaint, filed on 5 October in Maricopa County Superior Court, cites remarks attributed to Beck and Broecker by the Arizona Daily Star, and letters from Broecker to ASU and NSF, that deal with the feasibility of doctoring samples. In the complaint, Dorn says such remarks “clearly implied professional misconduct” and “seriously damaged” his ability to win grants, although the suit claims no specific amount for damages. In addition to Beck and Broecker, Dorn is suing Douglas Donahue, A.J.T. Jull, and George Burr of the University of Arizona's AMS Laboratory; Broecker's employer, Columbia University; linguist and rock art researcher Ekkehart Malotki of Northern Arizona University in Flagstaff; and Georges Bonani and Irka Hajdas of the Swiss Federal Institute of Technology in Zurich.

    Lawyers say Dorn's case may rest on whether he can show that the authors went beyond normal academic discourse in criticizing him. Gilbert Whittemore, whose firm, Stalter & Kennedy in Boston, is not involved in the case, says the possibility of such litigation could prompt researchers to avoid future controversies. “Scientific disputes normally get worked out by a rip-roaring debate in the literature,” he says, not in the courtroom.

  10. MEXICO

    Student Strike Engulfs Research Activities

    1. Jocelyn Kaiser

    A student strike that has gripped Mexico's main university for 6 months has now spread to the school's research institutions. Faculty members have already been hampered by months of delays at student-controlled checkpoints, thefts of equipment, and other hassles, and last week some scientists spent hours negotiating for the right to keep their labs open. “The damage … will be difficult to repair,” says Jaime Urrutia, director of the Institute of Geophysics.

    The protests at the Mexico City campus of UNAM—the 260,000-student National Autonomous University of Mexico—began on 20 April when student activists protested a proposed hike in tuition from pennies to about $150 a year. In June the university abandoned that plan, but the strike has continued, with students now demanding an end to all fees, looser admissions and graduation standards, and much more power on UNAM's governing council. More turmoil could lie ahead if university workers follow through on a threat to strike for higher wages. The school's 2000 researchers publish about half the papers by Mexico's scientists, and hundreds of faculty members have signed a letter asking the government to enforce the law and restore order. But Mexican President Ernesto Zedillo so far has refused to intervene in apparent fear of stirring public opposition in an election year.

    Although students have occupied schools and teaching buildings for months, it was not until 18 October that they began to invade some of the 24 research institutes and centers, shutting down parts of the geography, geology, and geophysics buildings. Physicists and applied mathematicians convinced the activists to keep their buildings open, however, and several geophysicists told Science by e-mail that their colleagues had argued successfully that work such as monitoring of the nearby Popocatépetl volcano should go on. The geosciences, says Urrutia, “are particularly important because of the [recent] earthquakes, eruptive activity, and flooding of the past months.”

    Even though most research labs are still functioning, scientists say the prolonged shutdown of much of the campus has impeded their work. “The major inconvenience [for us] is traffic and communication problems,” says Fernando Lopez-Casillas of the Institute of Cell Physiology. Each day researchers must pass through student-controlled gates, where they face verbal harassment. The strikers have also made it difficult to leave campus with equipment, including new seismographs for monitoring Popo. Overnight deliveries can only be picked up off-campus. The strikes have also touched off a wave of vandalism and robberies, including reported thefts of computers and several vehicles used by scientists.

    An open letter from UNAM faculty and many international colleagues calls on the Mexican government to impose the “rule of law.” The letter says that complying with strikers' demands, coupled with proposed cuts in UNAM's budget, “would leave us with a totally devalued institution, putting at grave risk one of the most ambitious and successful public and national university projects in Latin America.” But Zedillo, according to Jose Antonio Zabalgoitia, a spokesperson at the Mexican embassy in Washington, D.C., “will not order any police or public force to drive the strikers out without strong evidence from the university community itself that they are not supporting the strikers and they want the strike over.” Some scientists say they may have to hunker down until the political climate is more favorable: “We hope the problem will clear up once the primaries are over” next month, notes UNAM seismologist Cinna Lomnitz.

    Meanwhile, Lopez-Casillas says he plans to take a computer home to write papers on his work characterizing the transforming growth factor-β receptor if his institute is closed. But labs working with higher animals, such as a colleague's highly regarded research using monkeys to study sensory perception, “would probably be totally destroyed” if the strike encompassed them, he says. Adds Lourival Possani, a biochemist at the biotechnology institute in Cuernavaca, 100 kilometers from Mexico City, “If they shut down the research, it's going to be a disaster for the entire country.”


    Candidate 'Gene Silencers' Found

    1. Evelyn Strauss

    Sometimes genes don't add up. About a decade ago, researchers added extra copies of pigment genes to petunias, hoping to darken their purple flowers. Instead, the petals turned white. Biologists now know that in many organisms, including plants, worms, and flies, adding an extra dose of a gene can have the paradoxical effect of slashing that gene's expression. This phenomenon, which goes by the name posttranscriptional gene silencing (PTGS) and helps organisms defend themselves against viral and other foreign nucleic acids, occurs because the added gene somehow causes destruction of the messenger RNA (mRNA) made by both it and the corresponding cellular gene. As a result, production of the gene's protein product shuts down. Now, researchers may have found the tracking system that homes in on the mRNA and triggers its destruction.

    Because gene silencing targets specific mRNAs, many people have thought that so-called antisense RNA—RNA with a nucleotide sequence complementary to the gene's mRNA—might be involved, possibly as a tag that marks the mRNA for degradation. They have been unable to identify those antisense RNAs or any other nucleic acid involved in silencing, however. But on page 950, molecular geneticists Andrew Hamilton and David Baulcombe of the John Innes Centre in Norwich, U.K., report that they have come up with a likely prospect: short RNA snippets, 25 nucleotides long, that match the gene being silenced.

    “This could be just what we're looking for. It's the first good candidate for RNA molecules that have a role in PTGS,” says Richard Jorgensen, a molecular geneticist at the University of Arizona, Tucson. The work should “lead to a much more mechanistic understanding of the process than we currently have.”

    Baulcombe and Hamilton, a postdoc in Baulcombe's lab, suspected that previous workers had failed to find antisense RNAs in silencing because the molecules were so small that they were running through the analytical gels too quickly for researchers to detect them. For the new search, Hamilton first added a gene that encodes a plant enzyme, called ACO, to each of five tomato plant lines. In two of them, the added gene led to the silencing of the endogenous ACO gene as indicated by the disappearance of its RNA.

    The researchers extracted nucleic acids from the plant leaves, enriched the cellular mixtures for low-molecular-weight molecules, and separated the components using a gel system designed to retain small molecules. Hamilton then probed the nucleic acids with a piece of radioactive RNA that specifically binds to antisense ACO sequences. This probe picked up a 25-nucleotide molecule from the two lines where the gene was silenced but not from the three others. The researchers proved that the molecule was in fact RNA by showing that it disappeared from samples subjected to enzymes and chemicals that destroy RNA but not DNA.

    To see whether similar RNAs would turn up in other silencing examples, Hamilton added a nucleic acid containing green fluorescent protein (GFP) sequences to a leaf of a tobacco-related plant that already had a gene for GFP engineered into all its cells. A gene introduced in one place in an organism can silence the corresponding RNA at distant locations, and, in keeping with that, by several weeks later the GFP fluorescence had disappeared throughout the plant. Again, the team detected 25-nucleotide GFP antisense RNA in tissues exhibiting PTGS but not in control plants. The researchers analyzed several other examples of PTGS and in all cases they found a 25-nucleotide antisense RNA specific for the silenced gene.

    The findings don't distinguish whether these antisense molecules cause silencing or are byproducts of it. If they're the cause, they may be made by an enzyme called RNA-dependent RNA polymerase (RdRP), which copies one RNA from another, creating antisense fragments. Researchers note, for example, that RdRP levels in plant cells rise upon viral infection, when gene silencing takes place. Alternatively, the 25-nucleotide RNAs may be debris left by an enzyme that chews RNA down to precisely that size. Either way, the existence of such uniform-sized RNAs provides insight into silencing, says Phillip Sharp, a biochemist at the Massachusetts Institute of Technology: “That's really remarkable. … There's a very precise biochemical mechanism in there.”

    Besides pinning down what the RNAs are doing and how they are made, researchers would like to use them to enter the natural world of PTGS. Beyond acting as a defense against foreign nucleic acids, the normal role of PTGS is largely a mystery. “Can we find these 25-nucleotide RNAs in plants that don't contain any foreign DNA at all?” asks Baulcombe. “If so, what are they specific for? That will give us ideas about the processes they control.”

    And of course, scientists wonder whether the findings in plants apply to other organisms. “I can guarantee there will be a lot of flies and worms ground up” to look for a small RNA, Sharp says.


    Fetal Cells Help Parkinson's Patients

    1. Laura Helmuth

    Miami Beach, Florida—A controversial therapy that involves injecting fetal cells into the brains of Parkinson's patients can slow down the progression of the disease, according to the first double-blind, placebo-controlled clinical study of the procedure. The study, presented here on 24 October at the Society for Neuroscience's annual meeting, shows that the fetal cells can produce a critical neurotransmitter, reducing patients' tremors and paralysis.

    Parkinson's disease is marked by the death of brain cells that make the neurotransmitter dopamine. Since the 1980s, researchers have been developing a technique to replace those brain cells with fetal cells destined to produce dopamine. In 1994, a team led by Curt Freed of the University of Colorado, Denver, received the first grant from the National Institutes of Health for a double-blind, placebo-controlled study of fetal cell transplants in human patients.

    Forty patients with advanced Parkinson's disease underwent an operation in which a long needle was inserted through the forehead in four places, under local anesthesia. In half of the patients, the needles delivered small amounts of brain tissue—derived from four 7- to 8-week-old embryos—to the putamen, one of the brain areas affected by Parkinson's. The other patients constituted a control group. For them, the operation was a sham; nothing was injected into their brains.

    One year after the operation, the control group hadn't improved. But the fetal tissue seemed to have taken hold in the patients who received a transplant: Positron emission tomography scans showed a 20% or better increase in dopamine activity in the putamen in more than two-thirds of the treated patients. Patients aged 60 or younger showed a marked reduction in Parkinson's symptoms, while older patients improved only slightly compared to the controls. Even after 36 months, the transplant group was doing better than the controls, Freed reported at the meeting.

    The results of the trial are “modest, but [it was] very well done,” says Roy Bakay, a neurosurgeon at Emory University in Atlanta. “It's the first study, and there are going to be advances in technology that will be exponential.” New techniques to help fetal cells survive the transplant, for example, should lead to more dramatic clinical benefits, he predicts. Bakay adds that if the few human trials still under way also show no placebo effect from a sham operation, he hopes the Food and Drug Administration will remove one ethical objection to such research by allowing researchers to pit one treatment against another. “Maybe after a few of these studies, we shouldn't have to do sham operations anymore,” he says.


    Researchers Plan Free Global Preprint Archive

    1. Eliot Marshall

    While the National Institutes of Health (NIH) moves ahead with plans to create a free database of biological publications, a group of research librarians and information experts is trying to concoct something more far-reaching. The leaders—who are following the model of the Los Alamos National Laboratory (LANL) physics archive—met last week in Santa Fe, New Mexico, to begin working out the framework for a “universal preprint archive” that would include papers from all disciplines. By November, according to spokesperson Herbert Van de Sompel of the University of Ghent in Belgium, the group hopes to release a set of indexing protocols that would permit authors to deposit their work at participating sites and readers to retrieve the full text at no cost.
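The framework was still being hammered out when this was written. As a purely hypothetical sketch of the cross-archive indexing idea (every field name and identifier below is invented for illustration, not taken from the group's protocols):

```python
from dataclasses import dataclass, field

@dataclass
class PreprintRecord:
    # Hypothetical metadata fields, invented for illustration.
    identifier: str        # unique ID assigned by the home archive
    archive: str           # participating repository holding the full text
    title: str
    authors: list = field(default_factory=list)

def merge_indexes(*archive_feeds):
    """Cross-archive services harvest each repository's metadata into one
    searchable index; the full text stays at the home repository."""
    index = {}
    for feed in archive_feeds:
        for record in feed:
            index[record.identifier] = record
    return index

# Two illustrative metadata feeds from different disciplines.
physics_feed = [PreprintRecord("physics/0001", "LANL", "An example preprint")]
econ_feed = [PreprintRecord("econ/0002", "RePEc", "Another example")]

combined = merge_indexes(physics_feed, econ_feed)
print(sorted(combined))  # ['econ/0002', 'physics/0001']
```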

    Van de Sompel, an expert on digital libraries, teamed up with Paul Ginsparg, founder of the LANL archive, and LANL research library director Rick Luce to organize last week's meeting. In attendance were more than 20 information specialists representing a variety of institutions, from Harvard University and the Massachusetts Institute of Technology to NASA and the U.S. Library of Congress. All support the idea of making scientific papers freely accessible to the public, although individual participants differ on specifics, such as how to handle non-peer-reviewed material.

    The group aims to encourage the growth of preprint repositories such as the Los Alamos archive and knit them together with a set of protocols. Ginsparg's project at LANL began in 1991 as an archive for physics. Now it contains more than 100,000 papers on math, physics, and computer science. Ginsparg declined to discuss the new project in detail but said, “The hope is … [to] catalyze real progress in new scholarly publishing models over the next 5 to 10 years.”

    Several groups have already established preprint archives in their own disciplines, some of which have grown rapidly. For example, economists have organized several repositories in a site called Research Papers in Economics, coordinated by Thomas Krichel of the University of Surrey, U.K. And Stevan Harnad of the University of Southampton, U.K., oversees CogPrints, a collection of papers in cognitive science, psychology, neurology, linguistics, and related fields. Last week's meeting was aimed at stimulating other grass-roots efforts.

    Van de Sompel says the participants “managed to agree on some important technical matters that will enable the creation of cross-archive end-user services,” which are now being worked out in detail. The format is likely to follow a model described in a draft “Santa Fe Agreement” released earlier this month by Krichel. This draft, which lacks the indexing tags agreed upon last week, establishes a process by which archives and data providers can affiliate with the group. For example, it requires unanimous consent for changes and declares that the objective is “open and cooperative” sharing of data.

    The Santa Fe effort differs in tone from NIH's PubMed Central: It's more radical. At present, the latter is gearing up to be a distributor of traditional peer-reviewed articles. But the Santa Fe archivists are focused on another type of scholarly discourse, one in which editors, peer reviewers, and paper will be optional.


    First Glimpse of a Cosmic Funnel

    1. Mark Sincell*
    1. Mark Sincell is a science writer in Houston.

    Astronomers have caught their first-ever glimpse of the funnel that channels a fountain of subatomic particles, erupting from the center of a galaxy, into a narrow stream thousands of light-years long. The radio images, published in the 28 October issue of Nature, hint that a sheath of twisted magnetic fields focuses the particle stream.

    Many galaxies have turbulent hearts that emit powerful beams of radio-emitting plasma like this one, which streaks from the center of M87, a galaxy 50 million light-years from Earth. Resembling spotlights at a Hollywood movie premiere, such beams are probably generated as matter plunges into a supermassive black hole at the center of the galaxy. Magnetic fields churned up by the swirling, superheated matter presumably squeeze the material into a beam, but astronomers had been unable to locate the focusing “lens.” The main problem is that the postulated lens must lie very close to the black hole itself, and it takes a radio telescope the size of Earth to see detail that fine at the center of a distant galaxy.

    Fortunately, a team of radio astronomers led by Bill Junor at the University of New Mexico, Albuquerque, had access to just such a telescope: the Very Long Baseline Array (VLBA). Consisting of 16 electronically linked radio dishes extending from Hawaii to Italy, the VLBA imitates the resolving power of a single telescope with a 10,000-kilometer-wide dish. “It is a wonderful instrument,” says Junor.
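As a rough consistency check of those numbers (the observing wavelength is an assumption for illustration; 7 millimeters is a band the VLBA observes in, but the text does not state it):

```python
# Diffraction-limited resolution: theta ~ wavelength / baseline (radians).
# The 7-millimeter wavelength is assumed; the 10,000-kilometer baseline and
# the 50-million-light-year distance to M87 come from the text.
wavelength_m = 7e-3
baseline_m = 10_000e3
theta_rad = wavelength_m / baseline_m

distance_ly = 50e6
resolved_ly = theta_rad * distance_ly  # small-angle approximation
print(f"{resolved_ly:.3f} light-years")  # a few hundredths of a light-year,
                                         # the scale of the observed cone
```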

    Junor's image of M87 shows a 60-degree plasma cone a few hundredths of a light-year long emerging from the center of M87. Because this cone feeds directly into the 1000-light-year-long, 6-degree-wide jet seen in older images, Junor's team concludes that something is squeezing the plasma into a tight stream. “We want to invoke magnetic fields to wrap the jet,” says Junor, but testing that theory with computer models “is a horrendously complicated problem.” Maybe so, but astronomer Meg Urry of the Space Telescope Science Institute in Baltimore says, “these observations are an important first step” to understanding the focusing mechanism.


    Turning Thoughts Into Actions

    1. Marcia Barinaga

    A flurry of new work suggests that paralysis victims could soon benefit from devices that translate their thoughts into commands to operate computers or even robot limbs

    Performing a task any healthy rodent can master, a rat in a video depresses a bar in its cage to get a metal arm to swing inside and deliver a sip of water. The next time around, the arm swings early—before the rat has even finished pressing down. “Do you see the surprise in his eyes?” jokes neuroscientist John Chapin of Hahnemann University in Philadelphia, knowing full well that those beady black orbs don't betray that emotion. However, this is not a prank some character in a “Far Side” cartoon might pull. The water arrived a fraction of a second early because the arm was driven by neurons in the rat's brain that signaled its intention to press the bar with its paw.

    Chapin's rats are test subjects for pioneering machines that can translate thoughts into action without the need to twitch a muscle. The ultimate beneficiaries of these now-primitive devices could be millions of quadriplegics and the half-million or more patients worldwide who are “locked in”—so utterly paralyzed that some are unable even to blink. A tool that could read their thoughts to execute commands or compose letters on a computer, for instance, would become a lifeline to the outside world. For many, such a modicum of independence may offer a reason to live.

    “These people have nothing,” says neuroscientist Emanuel Donchin of the University of Illinois, Urbana-Champaign, of the locked-in patients he hopes to help. “A lot are allowed to die,” he notes, and some, who gradually become locked in through progressive diseases such as amyotrophic lateral sclerosis (ALS), may choose to end their lives. “[If] you could offer them some minimal quality of life, they might choose to [live].”

    Only a few years ago, thought-driven machines were the stuff of science fiction, not science. But the field of neuroprosthetic devices is advancing quickly, spurred by an influx of talented researchers who say the brain may be more primed than anyone had realized to learn to treat a cursor or a robotic arm as an extension of the self. A first generation of communication devices, driven by brain waves detected from outside the body or by electrodes implanted in the brain, is now being tested in paralyzed patients. And recent research, some of it presented earlier this week in Miami at the annual meeting of the Society for Neuroscience, suggests that more sophisticated devices allowing people to operate a robotic arm by thought alone may not be far behind.

    Many technical hurdles remain, including the need for longer-lasting electrode arrays that are more compatible with brain tissue and the puzzle of how to make robotic limbs move and respond in the most realistic manner. But such obstacles don't seem insurmountable, experts say. “A lot of these things are engineering problems, not scientific problems, and they will be solved,” says neuroscientist Andrew Schwartz of the Neurosciences Institute in La Jolla, California, who is collaborating with an engineering team at Arizona State University to develop a prosthetic arm.

    Work on prosthetic devices that tap directly into the nervous system has been going on for years and already has a great success story: cochlear implants that stimulate the auditory nerve, which are now widely used to restore hearing to deaf patients. Then, in the 1980s, a band of researchers began devising techniques for making information flow the other way—wiretapping the brain's internal communications.

    Some chose to develop devices that could be steered by brain waves detected outside the head; research more than 30 years ago showed that people can be trained to control these patterns. Another group of researchers was inspired by findings on how populations of neurons in the brain's motor regions work together to encode the direction of limb movements, advances led by neuroscientist Apostolos Georgopoulos of The Johns Hopkins University School of Medicine. Those discoveries, as well as the advent of electrodes that could record from dozens of neurons at once, raised the possibility that one could tap those “population codes” to drive prosthetic limbs.

    The first patients to benefit from brain-driven prostheses are likely to be people with locked-in syndrome. Stricken with advanced ALS, brainstem strokes, or traumatic injuries, they are totally paralyzed, their breathing controlled by ventilators. For the fraction of locked-in individuals who can move their eyes, computer setups can track eye movements and allow the patients to spell out words by looking at letters on a screen. But the most severely locked in do not have reliable control of their eyes.

    It is for those patients that researchers such as Niels Birbaumer of the University of Tübingen, Germany, and Jonathan Wolpaw of the New York State Department of Health's Wadsworth Center in Albany are working. They set out more than a decade ago to harness for communication the ability of people to control their electroencephalograms (EEGs), the patterns of brain waves detectable by electrodes placed on the scalp. The EEG is the sound of the brain at work, an electrical hum escaping from the skull that reflects the overall activity of neurons firing in the cerebral cortex.

    The two teams have used similar strategies: linking EEG patterns to the movement of a cursor on a computer screen, then training subjects to control cursor movements by altering their EEGs. Birbaumer's team focused on a particular EEG readout called the slow wave, which reflects the overall negative or positive charge of the EEG pattern. They taught patients to control the wave by playing a simple game: “On the computer you see the negativity in the form of a light ball which moves across the screen toward a goal,” he says. “If it reaches high enough amplitude, the goal lights up.” Because the EEG measures the average of many types of brain activity, it's hard to instruct patients on how to control it; they must learn through trial and error.

    Wolpaw prefers to use the “mu” wave, which emanates from the scalp near areas in the cortex that control movement. He says his subjects often begin learning to control it by imagining themselves playing a sport. “Eventually,” he says, “they say they don't need the imagery anymore.” He has found that healthy subjects can gain control of a cursor within 10 sessions and thereafter become faster and more accurate. The subjects can answer four yes-or-no questions in a minute, about a quarter of the speed at which they could speak the answers.

    Birbaumer's team, meanwhile, built a device that lets people use a cursor to spell out words. The subjects answer yes-or-no questions to indicate which half of a progressively divided alphabet contains the letter they want. The team has so far trained three locked-in patients to use the device. The pace is slow—two or three letters per minute at best—and can be exhausting, says Birbaumer. But, he adds, his patients are so glad to be able to communicate that they “don't care about speed at all.”
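    Birbaumer's speller is, in effect, a binary search over the alphabet: each yes-or-no answer halves the set of candidate letters, so any one of 26 letters can be reached in at most five answers. A minimal sketch of the halving logic (the letter ordering and automatic answering below are illustrative, not the team's actual software):

```python
def spell_letter(target, alphabet="ABCDEFGHIJKLMNOPQRSTUVWXYZ"):
    """Locate one letter through yes/no answers about successive halves."""
    candidates = list(alphabet)
    questions = 0
    while len(candidates) > 1:
        first_half = candidates[:len(candidates) // 2]
        questions += 1
        # "Is your letter in the first half?" -- answered here automatically,
        # where a patient would answer by steering the EEG-driven cursor.
        if target in first_half:
            candidates = first_half
        else:
            candidates = candidates[len(candidates) // 2:]
    return candidates[0], questions

letter, n_answers = spell_letter("S")  # at most 5 answers for 26 letters
```

    Each letter costs up to five answers, which helps explain why even well-trained patients top out at a few letters per minute.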

    Despite the appeal of this noninvasive technique, most researchers agree that EEG patterns cannot convey enough information to generate the complex and rapid commands necessary to move a prosthetic limb. “That will never happen,” says Birbaumer. But that mountain may be climbed with another class of neuroprostheses, in which neural signals are gathered directly via electrodes implanted in the brain.

    The first of this breed has just made its debut in human beings. Philip Kennedy, chief scientist of Neural Signals Inc. in Atlanta, Georgia, has built an electrode that releases a cocktail of growth-promoting compounds. The biochemicals coax neurons to sprout projections that grow into the electrode's porous tip, where it registers their activity. In 1996, Kennedy received approval from the Food and Drug Administration to test the electrode in humans; his collaborator, neurosurgeon Ray Bakay, has so far implanted it in three locked-in patients.

    Because his goal was to have patients use their neural activity to move things—first a cursor on a computer screen and someday a prosthetic arm—Bakay placed the electrode in a brain area involved in controlling movement, the primary motor cortex. He used functional magnetic resonance imaging to visualize brain activity while the patients imagined moving a hand. Next, he inserted the electrode in the brain region active during the test.

    One patient died from her disease shortly after surgery, while another had the procedure in July and is waiting for the neurons to finish growing into the electrode. But the third, a 53-year-old man paralyzed by a brainstem stroke, has had the electrode for more than a year, and he has learned to direct a cursor to choose letters and spell messages. First the man succeeded by thinking about moving his hand, says Kennedy, but now he simply thinks about the cursor itself to make it move.

    That testifies to the brain's remarkable plasticity, Kennedy says. Scientists have known for years that unused portions of the higher brain can take on new functions—for example, the visual cortex of a blind person can register the tactile sensations for reading Braille—or that a neighboring brain area can take over the functions of one destroyed by a stroke. In Kennedy's patient, it appears that the brain area once devoted to moving the patient's now-paralyzed hand has adjusted to its newfound ability to move the cursor, evolving into what Kennedy calls a “cursor-related cortex.” Such adaptability, he says, will be the key to success for future devices: “Plasticity is our friend.”

    However, the device has been tested in only a single patient so far. And although Kennedy and others believe that direct recording from neurons will eventually prove superior to EEG-driven devices, they have a long way to go before the benefits of implanting electrodes in many patients outweigh the risks of surgery. Birbaumer's EEG patients “can do three letters a minute, and on a good day [my patient] can do better than that,” says Kennedy. “But he can't do 10 times better.”

    Getting brain neurons to drive prosthetic limbs will require electrodes, now under development, that can tap more than just a handful of neurons. “If the technology can generalize to … recording from 100 or 200 [neurons], that is when you are going to see a major step forward,” says William Heetderks, director of the Neural Prosthesis Program at the National Institute of Neurological Disorders and Stroke in Bethesda, Maryland.

    Indeed, a tide of neuroscientists surging into prosthesis research got involved through a quest to understand how the brain encodes information. Their approach is to analyze electrical patterns recorded from dozens of neurons at once (Science, 17 April 1998, p. 376). “We were interested in this for mainly theoretical reasons,” says neuroscientist John Chapin of the MCP Hahnemann School of Medicine in Philadelphia. “The issue is how the brain works, and neural population coding is the essence of that problem. Then, the idea of solving a practical problem, which is neural prostheses, came along.”

    Chapin's team at Hahnemann wanted its rats to perform a task requiring a specific force of paw pressure and precise timing, to see if the code for those elements could be deciphered from the animals' neural activity and translated into moving a mechanical arm. The team members devised a task in which a spring-loaded bar controlled a swinging arm with a cup at the end. A rat had to push the bar down just far enough to swing the arm until the cup was under a stream of dripping water outside the cage, hold the cup there to catch some drops, then release the bar, allowing the arm to return so it could drink.

    In the region of the rat's motor cortex that controls paw movement, the team implanted an array of 24 microelectrodes that could record the firings of up to 48 neurons at once. “We found that the activity in the brain just before the animal pressed the lever carried a code for how far down the lever would be pressed,” says Chapin. The team fed the neural firing pattern into a computer that would translate it into a signal to move the arm the right distance. Then “in the middle of the experiment, we switched the switch, and suddenly the robot arm was being controlled from the brain,” Chapin says. “The question was: Can the animal still move the robot arm to the water? The answer was that he could.”
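    The article does not describe how Chapin's computer mapped firing patterns onto lever distance. A common stand-in for such decoding is a simple linear model fit by least squares, sketched below on synthetic data; the trial count, cell count, and Poisson firing model are all assumptions for illustration, not the team's actual method.

```python
import numpy as np

# Hypothetical setup: decode lever displacement from the firing rates of
# n_cells motor-cortex neurons, recorded over n_trials lever presses.
rng = np.random.default_rng(0)
n_trials, n_cells = 200, 46
true_weights = rng.normal(size=n_cells)          # the brain's "code" (unknown)
rates = rng.poisson(5, size=(n_trials, n_cells)).astype(float)
displacement = rates @ true_weights + rng.normal(0, 0.5, n_trials)

# Fit decoding weights on the recorded trials by ordinary least squares...
w, *_ = np.linalg.lstsq(rates, displacement, rcond=None)

# ...then drive the "robot arm" from fresh neural activity alone.
new_rates = rng.poisson(5, size=n_cells).astype(float)
commanded_distance = new_rates @ w
```

    Flipping the switch mid-experiment corresponds to the last two lines: the arm's command now comes from `new_rates @ w` rather than from the lever itself.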

    Chapin's team, with collaborators led by Miguel Nicolelis at Duke University, is trying to repeat its success in monkeys, whose brains more closely resemble our own. Several others are also in the race. Groups led by John Donoghue at Brown University and Schwartz at the Neurosciences Institute in La Jolla are experimenting with multiple electrodes placed in the motor cortex of monkeys. And at the neuroscience meeting this week, Richard Andersen's group at the California Institute of Technology in Pasadena described similar experiments in the parietal cortex, which helps transform sensory information into plans for movements.

    They all share the same goal: to use neural firing information recorded in real time to reconstruct the motions of the monkeys' arms. If the scientists can interpret the direction, speed, and distance of arm movement from the corresponding neural signals, then they should be able to use the nerve-firing pattern to direct the movements of a mechanical arm.

    The hope is to “generate natural-looking arm movements, like reaching toward a glass, or gesturing,” says Schwartz. Toward that goal, Matt Fellows and Liam Paninski from Donoghue's team also reported at the meeting that they have recorded neuron firing in the motor cortex as a monkey used its hand to follow a randomly wandering cursor across a screen. From the firing patterns, the researchers derived a mathematical code that could reconstruct the path of arbitrary arm movements the monkey made later.

    That all these variables could be predicted “based on the information you can get out of fewer than a couple of dozen cells,” Donoghue says, “means the potential for recreating the entirety of any movement is much better than I would have expected.” His team has already used neural information recorded from a monkey to direct the movement of a prosthetic arm to one of eight points in space, and it plans to go on to more natural, less constrained movements.

    Schwartz and his collaborators at Arizona State have just achieved that goal. At a neural prosthesis workshop at the National Institutes of Health 2 weeks ago, Schwartz reported that they have operated a robotic arm directly from recordings of 26 neurons in a monkey's brain, mirroring the animal's own arm as it pushed buttons on a panel or chased an almond across a tabletop. The neural activity provided velocity and direction of movement, which the researchers fed every 20 milliseconds into the computer controlling the robot. The computer determined which joints to move, and how much. The result, says Schwartz, is “three-dimensional, natural arm movement through free space.”

    In these experiments, the monkeys went about their business unaware of the robotic arm. The next step will be to have the animals consciously control the arm, much as a paralyzed person might.

    The eventual performance of these devices in people will be aided, researchers expect, by the brain's well-proven plasticity in adapting to new tasks. That plasticity suggests that implantable prostheses need to be placed only in the general vicinity of the correct brain region—such as the motor cortex—rather than linked up to the very neurons devoted to particular movements, a correspondence that may be impossible to achieve in paralyzed patients. Then the brain should take over, learning to run the prosthesis with whatever neurons have been linked to it, much as neighboring brain areas assume new tasks after a stroke.

    Another hurdle to surmount is how to incorporate sensory feedback, such as pressure from touching an object, into a prosthesis. However, robotics research has shown that people get “an amazing amount” of useful feedback solely from what they are seeing, says Schwartz, who believes this will not be difficult to overcome. A stickier problem, he says, will be getting enough neural information to control fine hand motions. But even movements crude enough to grasp a spoon could make a big difference in a paralyzed patient's life. “If they could feed themselves, that would be huge,” Schwartz says.

    Scientists are optimistic that the engineering will carry the day, now that neuroscience has shown that the brain has the potential to control prostheses directly. If so, these scientific advances could soon be transformed into improvements in the lives of paralysis victims. “In principle, it should all be possible,” says neuroscientist Eberhard Fetz of the University of Washington, Seattle. “The question is to what degree it will become practical.” But with the new surge in interest and funding, Chapin predicts that goals “that looked like they might be 15 to 20 years off will suddenly be 5 to 10 years off.”


    A Long Season Puts Çatalhöyük in Context

    1. Michael Balter

    A marathon dig at one of the world's most famous Neolithic villages links it to other, older Near Eastern settlements

    Çatalhöyük, Turkey—Archaeologist Mirjana Stevanovic is still wondering about the skulls. Last summer, while excavating at this 9500-year-old Neolithic village near the modern city of Konya, Stevanovic and colleagues from the University of California (UC), Berkeley, found two skulls—with no skeletons attached—in a large house. One was that of a boy about 12 years old; the other belonged to a woman in her late 20s. They had been placed on the floor with their foreheads touching. The skulls were discovered very near the spot where, in 1998, the team found evidence that the roof had collapsed.

    Did the woman and boy die in the cave-in? Were they mother and child? Both skulls have unusual suture patterns in their bones, possibly lending support to the notion that they were related, says Basak Boz, an independent anthropologist from Ankara who is working with the team. And other clues at the site suggest that family ties were very important to these villagers.

    But no one can say for sure why the skulls were placed together. “All I know,” says Stevanovic, “is that they were put that way deliberately.”

    The skulls are just one more puzzle at Çatalhöyük, the largest Neolithic community yet discovered, which offers a rare glimpse of early settled life. But, although the secret of the skulls may never be revealed, the international team that has been excavating here for the past 6 years is beginning to decipher other mysteries of this ancient settlement. A marathon, half-year digging season that ended earlier this month yielded new dates, suggesting that Çatalhöyük is not quite as old as the 10,000 years archaeologists had estimated—and certainly not the “first city” it has sometimes been called. This revised chronology, together with other clues, links the settlement with the cultural tradition of others of the time in the Near East, says excavation director Ian Hodder of Stanford University. But Çatalhöyük remains unique, with up to 10,000 inhabitants crowded together for reasons that remain murky. “Çatalhöyük pushed the idea of a village to its logical absurdity,” Hodder says.

    First excavated in the early 1960s by the British archaeologist James Mellaart, Çatalhöyük was once thought to be a textbook case of an advanced agricultural settlement. Its people lived in mud-brick houses packed so closely together that they had to enter through holes in the roofs. They painted vivid hunting scenes on their plaster walls, buried their dead under the floors, and during at least 1000 years of occupation rebuilt houses one on top of another, creating a mound 20 meters high.

    Many of those startling finds still stand. But when a new group of archaeologists, led by Hodder, started excavating here again in the early 1990s, it began rewriting the Çatalhöyük textbook. These researchers found only rudimentary agriculture and little evidence of what Mellaart claimed was worship of a Mother Goddess (Science, 20 November 1998, p. 1442).

    This year's long dig, the equivalent of three or four normal seasons, was originally prompted by a falling water table—caused by intense crop irrigation nearby—that was threatening to dry out the fragile mud-brick structures. But the lengthy fieldwork was also an opportunity to do a lot of excavating, and its fruits are allowing Hodder's team to rewrite the text once again. The team deepened and broadened an area on the south edge of the mound that Mellaart had dug earlier, and it continued excavations in a separate building on the north side, where the skulls were found. The archaeologists had thought that several meters of remains might lie beneath the lowest levels reached by Mellaart, representing as much as 1000 years of even earlier occupation. But long before the end of the season, they hit clay marl. They realized that in the deepest areas Mellaart reached, he had been only 20 centimeters above virgin soil.

    Although earlier occupation levels might still lie below the center of the mound, this unexpected finding, combined with new radiocarbon dates putting Mellaart's lowest levels at about 7500 B.C., suggests that the earliest settlement at Çatalhöyük is later than that of a number of other Neolithic settlements in the Near East. For example, Jericho, about 700 kilometers away in Palestine, is at least 1000 years older. That puts Çatalhöyük “fairly late in the overall Near Eastern sequence,” Hodder says.

    A younger age helps make sense of another of this season's surprising discoveries: At the lowest levels in the Mellaart area, excavators found chunks of burnt lime, a substance common at many Neolithic sites in the Near East, including Jericho and 'Ain Ghazal in Jordan. That's because the plaster used in the mud-brick walls and floors characteristic of these sites was made from limestone, which was heated to at least 750°C and then mixed with water.

    Çatalhöyük's younger age might explain why all of the plaster found so far on walls and floors has been based on clay, which is less durable than lime, and why the loose bits of burnt lime were found only in the earliest levels. Hodder and other members of the team, including micromorphologist Wendy Matthews of the British Institute of Archaeology in Ankara, suspect that lime plaster was used during the settlement's very first days but later given up because its manufacture required too much fuel. Indeed, some archaeologists have speculated that the fairly rapid decline of 'Ain Ghazal and other settlements in the Levant about 8000 years ago might be partly due to the ravages their populations inflicted on the local environment in search of wood to burn.

    The burnt lime at Çatalhöyük suggests that the village was influenced by cultural traditions first developed to the south and east, even if the people later switched building techniques. “People came here with a long history,” says Christine Hastorf, an archaeobotanist at UC Berkeley.

    Even so, Hodder thinks that the community was not settled by migrants but was founded by a small indigenous population, then slowly grew in size. That view is supported by new evidence that the first Çatalhöyük settlers built more dispersed houses; only later, as the settlement became larger, did the honeycombed housing pattern made famous by archaeology textbooks emerge.

    With this year's marathon digging season at an end, the team plans to limit excavations over the next 2 years so it can study the new finds. They are hoping to find clues to many unanswered questions, chief among them why Çatalhöyük grew so large, and why the villagers lived so closely together rather than spreading out over time. Other Neolithic sites are also mounds or tells, as people constructed new buildings over the remains of old ones, but Çatalhöyük's tightly packed, layered houses are an extreme example. There's no evidence of warfare, and Hodder doesn't think this arrangement was for defense. Instead, he speculates that family ties spanning generations held the community together: People built their houses over the patch of land claimed by their ancestors.

    The touching skulls—as well as elaborate child burials found this year, such as infants packed into finely woven baskets—may offer additional signs of the affection Çatalhöyük parents felt for their children. But the high child mortality rate indicates that life for Çatalhöyük's younger members was hard. The eye-socket bones of most children found here are unusually porous—a sign of anemia and likely due to malnutrition, says Boz. Hodder speculates that child labor might have been essential for keeping the community viable: “Childhood as we understand it today probably was not relevant back then.”

    Although this year's excavation season seems to have taken Çatalhöyük out of the firmament of archaeological stardom and placed it more squarely on Near Eastern soil, Çatalhöyük's extraordinary paintings and artifacts do not fit perfectly into any existing Neolithic tradition. And, unlike other large villages and towns in the Near East, which began to show signs of centralization and hierarchy once they grew to a certain size, Hodder says that Çatalhöyük shows no evidence of public buildings or division of labor and seems to have lived on as an “egalitarian village,” despite its dramatic growth. “It is part of a broader trend, and yet remains unique. That is what's so extraordinary about it.”


    Gamma Ray Bursts Keep Playing Coy

    1. Govert Schilling*
    1. Govert Schilling is an astronomy writer in Utrecht, the Netherlands.

    Huntsville, Alabama—Since 1991, astronomers have met here every 2 years to puzzle over gamma ray bursts, mysterious flashes of energy that may come from the most powerful explosions since the big bang. These events have surrendered many clues to their location and nature over the past couple of years. At the 5th Huntsville Gamma Ray Burst Symposium, held 19 to 22 October, they dropped another hint or two—and another puzzle.

    “X-ray Flashes” Puzzle Astronomers

    It may sound like a feeble joke, but astronomers say they have discovered a new kind of gamma ray burst: one without gamma rays. At the symposium last week, astronomer John Heise of the Space Research Organization Netherlands announced that the Dutch-Italian satellite BeppoSAX has spotted strange outbursts of x-rays that look exactly like the byproducts of a gamma ray burst—they just don't come with gamma rays.

    A mystery for decades, gamma ray bursts have recently been linked to some of the most energetic events in the cosmos: vast supernova explosions that spawn black holes in the far reaches of the universe (Science, 15 October, p. 395). Invisible from the ground because the atmosphere absorbs gamma rays, the bursts are spotted about once a day by BeppoSAX and other Earth-orbiting satellites. BeppoSAX also detects flashes of x-rays, which have accompanied every gamma ray burst in the field of view of its x-ray cameras so far, and which also emanate from astronomical objects such as flare stars and x-ray novae. Now, Heise is reporting still another kind of x-ray burst.

    Sifting through the BeppoSAX data, Heise found nine x-ray explosions that resemble the x-ray signal from normal gamma ray bursts in every respect—they last tens of seconds, are scattered randomly across the sky, and have comparable spectra—except that no gamma rays were seen. “These are certainly not normal flare stars or other known objects,” says Heise. “It could be a completely new class of events.”

    But the “x-ray flashes,” as Heise prefers to call them, could be related to gamma ray bursts after all, says Stan Woosley of the University of California, Santa Cruz. Woosley has developed theoretical models for what he calls “dirty” gamma ray bursts: If a cosmic explosion happens in a place where the interstellar gas is relatively dense, the gamma rays could easily be absorbed by the gas while some x-rays escape, he says. However, Woosley's model predicts a different kind of x-ray spectrum from the ones BeppoSAX has measured.

    Quick follow-up observations made immediately after a burst to look for an “afterglow” at other wavelengths, including optical and radio, could reveal the true nature of Heise's mysterious flashes. And those could come quickly: Immediately after Heise's announcement, Luigi Piro of the Istituto di Astrofisica Spaziale in Rome, the mission scientist for BeppoSAX, decided that x-ray flashes will from now on get the same treatment as gamma ray bursts: BeppoSAX's narrow-field x-ray cameras will pinpoint their exact location as soon as possible, to tell optical and radio telescopes where to look for an afterglow.

    A Yardstick for Gamma Ray Bursts?

    Astronomers may have found a simple way to measure the distance to gamma ray bursts, mysterious explosions that occur about once every day in the far corners of the universe. A study presented at the symposium showed that there is a neat correlation between the amount of “flickering” in a burst and its intensity—a correlation that, if it holds up, would tell astronomers the distance of any new burst quickly.

    When a satellite detects a gamma ray burst, astronomers can tell the direction it came from, but not its distance. To do that, they must immediately train ground-based telescopes on the burst's afterglow. From the redshift—a measure of the amount of cosmic expansion since the radiation was emitted—they can determine how far away it occurred. So far, that has been done for only six gamma ray bursts; the distance of the 2500-plus other observed bursts is anyone's guess.

    But those six bursts, say Enrico Ramirez-Ruiz and Ed Fenimore of Los Alamos National Laboratory in New Mexico, show a striking relationship: When the astronomers corrected for distance to determine the intrinsic brightness of the bursts, they found that the brightest flickered the most, while less luminous ones were much smoother. If this relation holds for all gamma ray bursts, it will be easy to determine their distances. The amount of flicker indicates the true brightness, and comparing this with the observed brightness gives the distance.
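    The proposed yardstick rests on ordinary inverse-square reasoning: the flicker calibration supplies an intrinsic luminosity L, and the measured flux F then fixes the distance via F = L / 4πd². A toy version of that logic, with made-up calibration constants (K and ALPHA below are illustrative, not Fenimore and Ramirez-Ruiz's fitted values):

```python
import math

# Hypothetical calibration: intrinsic luminosity grows with variability.
K, ALPHA = 1e45, 1.5  # illustrative constants only

def luminosity_from_flicker(flicker_index):
    """Flicker -> intrinsic brightness, per the (assumed) calibration."""
    return K * flicker_index ** ALPHA

def distance_from_burst(flicker_index, observed_flux):
    """Inverse-square law: F = L / (4 pi d^2)  =>  d = sqrt(L / (4 pi F))."""
    L = luminosity_from_flicker(flicker_index)
    return math.sqrt(L / (4 * math.pi * observed_flux))
```

    At equal observed flux, a burst that flickers more is intrinsically brighter and so must lie farther away—exactly the comparison the correlation makes possible.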

    “It's an amazing find,” says Don Lamb of the University of Chicago. “The correlation looks awfully good.” Knowing the distances of thousands of gamma ray bursts should reveal how early in the history of the universe these explosions occurred, says Lamb. If gamma ray bursts are caused by collapsing supermassive stars, as some popular models suggest, this would tell astronomers how early star formation began.

    A team led by Jay Norris of NASA's Goddard Space Flight Center in Greenbelt, Maryland, sees evidence for another luminosity indicator. These investigators note that in most bursts, the highest energy gamma ray photons arrive slightly earlier than the lower energy ones. According to Norris, for the six bursts with known distances, this spectral lag is smallest for the most luminous bursts and larger for the fainter ones. Both the flickering and the spectral lag might reflect conditions in a jet of matter that, in some scenarios, spews out from the explosion at nearly the speed of light, speculates Ramirez-Ruiz. He adds: “It smells like physics. It should tell us something about the progenitors” of gamma ray bursts.

    Astronomers should know within a year or so whether these distance indicators hold up. For example, says Lamb, if another half-dozen bursts with measured redshifts show the neat relation between flickering and brightness, astronomers can be confident that they've found a new cosmic yardstick.


    Sweden Takes Steps to Protect Tissue Banks

    1. Annika Nilsson,
    2. Joanna Rose*
    1. Nilsson and Rose are writers in Stockholm.

    There is growing interest among genetics researchers in Sweden's wealth of public tissue banks. Authorities are moving quickly to protect patient privacy

    Stockholm—Sweden and some other Nordic countries are sitting on a genomic gold mine. Their long-standing public health care systems have been quietly stockpiling unique collections of human tissue, some going back for decades. These banks hold samples of blood, sperm, fertilized eggs, umbilical cord blood, and biopsies taken during cancer treatment or medical examinations. The samples were originally stored for possible therapeutic or diagnostic uses for the patients themselves, but researchers now realize that they could contain valuable information about inherited traits that may make people susceptible to a variety of diseases. The Huddinge Hospital south of Stockholm, for example, has been storing blood samples from each newborn child in Sweden since the mid-1970s—a valuable resource for researchers needing population-wide genetic data.

    When biotech companies recently started inquiring about gaining access to both samples and patient information in these banks, however, Swedish medical authorities became increasingly concerned about protecting the privacy of those who had donated the tissues. Those concerns were heightened by allegations earlier this year of a potential breach of privacy at one tissue bank. As a result, the Swedish Medical Research Council issued a new set of ethical guidelines in June that require potential users of banked tissues to acquire informed consent for every new use and place stricter safeguards on patient information. But because these guidelines are not binding on private companies that might want to set up their own tissue banks, the government has ordered the National Board of Health and Welfare to work out a proposal for a new law regulating the creation and use of tissue banks. Danish and Norwegian authorities are also looking into new legal frameworks.

    The Swedish public and politicians really caught on to the privacy issue in April, when the tabloid newspaper Aftonbladet ran a series of articles reporting that some samples and information had already been passed on to a private company through collaboration with a researcher in Uppsala University's pathology department, which maintained a tissue bank. The newspaper also reported that the ethical committees at the hospital had not been informed that the researcher was also on the company's payroll as a consultant. This material indirectly gave the Uppsala-based company, Eurona Medical, access to the codes connecting patients to samples.

    Even before the details of that arrangement hit the press, the Swedish Medical Research Council, along with several other bodies responsible for research and medical ethics, had been trying to alert politicians to the need for tighter regulations. The new ethical guidelines they have now drawn up will govern all use of tissue banks in public hospitals and research facilities, regardless of whether the proposed research is privately or publicly funded, and will be overseen by local ethics committees, which themselves come under the umbrella of the Medical Research Council's ethical advisory board.

    The guidelines stipulate that the local ethical committee must approve each withdrawal, and each new use of the material should require informed consent from the donor. Gisela Dahlquist, chair of the research council's ethical advisory board, says many people seem to believe that once a blood sample and informed consent for research have been collected, researchers are free to use the blood for any purpose. “We want to put a stop to this idea. One cannot give informed consent to something when one doesn't know what it is,” she says. In addition, all tissue banks must have strict rules for how samples are stored. “The most important step is that the code key [connecting samples to individuals] shall be kept by a public agency and that private companies should only be allowed to handle unidentifiable material,” says clinical physiologist Lennart Kaijser of Huddinge Hospital, who is secretary of the council's ethical advisory board.

    As a model of how public tissue banks should interact with the biotech industry, researchers point to the company UmanGenomics of Umeå. Created in 1998 by Umeå University in collaboration with regional health care authorities, the company has exclusive rights to commercial use of Umeå's Medical Bank of blood samples. Since 1985, the bank has been collecting samples from almost the entire population of the isolated region of Västerbotten around their 30th, 40th, and 50th birthdays. Today it contains 100,000 samples from 60,000 individuals who have also been interviewed about their health and lifestyle. This archive can give unique insights into the environmental factors behind cardiovascular disease, often in connection with genealogical data on the patients.

    Although UmanGenomics has the commercial right to use the blood bank, the bank remains under public authority and can still be used by university researchers. “UmanGenomics constitutes a unique model for this type of company, where the blood bank is legally separated from handling the genetic information with an ethical body in between,” says Sune Rosell, director of UmanGenomics. He also points out that public involvement is ensured in three ways: All bank withdrawals have to be approved by the regional ethical committee; university and public representatives are members of the company board; and informed consent is required for each new use of the blood, even if that means advertising in the press to track down donors who cannot otherwise be reached.

    Ethical bodies hope the legislative proposals being worked out by the Health and Welfare Board will extend such requirements to every private company. The board is expected to complete its work next May. Dahlquist says clear rules are important to ensure that researchers and pharmaceutical companies can make full use of the tissue banks: “It's important that the banks can be used for valuable research and still maintain the confidence of the public.” The advantages of openness have been shown by the Umeå case: Only three people have declined to participate.


    Grad Students Head to Class as New NSF Teaching Fellows

    1. Jeffrey Mervis

    Thirty-one universities are participating in a $13 million, high-profile project to help students, teachers, and would-be scientists

    As a third-year graduate student in organic chemistry at the University of Kansas, Lawrence, Donald Probst is well on his way to becoming a professor or an industrial chemist. But on Tuesdays, Probst takes a detour from his career path, driving 40 minutes along Interstate 70 to spend the day at an inner-city high school in Kansas City, Kansas. He teaches chemistry to students hampered by inadequate preparation and high absenteeism. He also “tutors” the regular teacher, a biologist by training who was suddenly asked to take on chemistry and introductory physical science classes.

    But the students and faculty at Wyandott High School aren't the only ones who are learning. Probst hopes that his year in the classroom will help him hone his own skills at explaining his work to the public.

    Probst's adventure is funded by a new program at the National Science Foundation (NSF). Last month, Kansas became one of 31 universities to receive 2- to 3-year grants from NSF's Graduate Teaching Fellows in K-12 Education (GK-12) program, a favorite of NSF director Rita Colwell. The program, which gives 1-year stipends and tuition support to graduate students and advanced undergrads, supplements the agency's existing research fellowships and assistantships. Its aim is to raise student achievement, upgrade classroom teachers' skills, elevate the importance of teaching in the research community, and create a cadre of science communicators.

    “We want to send a message that [teaching in the public schools] is part of what scientists should do,” says mathematician Dennis DeTurck of the University of Pennsylvania, which will receive $1.5 million over 3 years. But, although nobody disputes the fellows' potential contribution to U.S. elementary and secondary math and science instruction, some educators say that the program's expansive goals and the variety of approaches it is supporting will make its long-term impact extremely hard to measure.

    GK-12 is on a fast track. Colwell, who launched the program barely 6 months after joining the $3.9 billion agency in 1998, rarely gives a speech without mentioning this initiative. She told Congress last month that she hopes to more than triple the $13.4-million-a-year program by 2001. Although universities had a scant 90 days to draw up proposals for this year's awards, 157 applications were submitted, and the quality was so high that NSF nearly doubled the sum of money it initially planned to spend.

    The winning institutions, which have promised to involve more than 1200 would-be scientists, are taking widely different approaches. At Kansas, Probst and five colleagues from other graduate departments are already spending one full day each week at Wyandott High School as “apprentices,” explains chemistry professor and project director Janet Robinson, a former high school chemistry teacher. This winter Robinson, who chose the fellows, hopes to add eight undergraduates to the mix.

    In contrast, the University of Rhode Island (URI) is in the midst of a stiff competition for its 12 slots. The winners, who must be graduate students, will receive a semester of pedagogical training, followed by summer workshops involving the teacher with whom the fellows will be paired in the 2000–01 academic year. “I'm not going to throw them into the classroom until they're ready,” says URI's Gail Scowcroft, associate director of marine and environmental education and a former marine biologist.

    Penn's approach falls between those two extremes. DeTurck has signed up 11 graduate students and 8 undergrads and has put them through 6 weeks of examining classroom materials and readying projects before sending them into the classroom for a range of activities. A project at Lea Elementary School, for example, will send fellows and students into the neighborhood to survey existing foliage and calculate where to plant trees. For math instruction, “we're concentrating on grades 3 to 5 because that's where the wheels start to fall off [in math achievement],” says DeTurck. In science, the focus will be on supporting new teachers in grades 4 and 8.

    Most of the GK-12 winners have already received grants under existing NSF education reform projects. “We're not novices,” says Robinson, who notes that her university and Kansas State recently received a $2.5 million teaching training grant from NSF to work with several local schools. “We know the needs of the district and what works.”

    The tremendous variation across programs may make it hard to draw lessons about what works, however. “If NSF really wants to learn the best way to improve science education and communication skills through the use of graduate fellows in the classroom, it would be helpful to have common benchmarks [for measuring success],” says Scowcroft, who will work with an outside evaluator assigned by NSF. “We don't have any now. I don't even know who else got money.” The program's vast scope, encompassing school-age children, graduate students, and classroom teachers, muddies the waters further, notes Texas A&M University mathematics professor David Sanchez, former chair of an advisory panel to NSF's education directorate. “What do you want to accomplish, and who are you trying to change?” he asks, noting that each audience poses a special challenge.

    Probst isn't worried about the big picture. He says the GK-12 fellowship opens up a new world outside a graduate student's usual roles in the lab and the college classroom. Although he'll sacrifice a bit of time in the lab on his research project, which focuses on potential pharmaceutical uses of a class of sulfur compounds, he thinks the trade-off will be worth it. “Scientists have a hard time explaining to people what they do because it's often so technical,” he says. “I think that's an important skill to learn. I'm also hoping to see that look of open-eyed wonder when a student understands something that I've said and says, ‘Hey, that's neat.’”
