News this Week

Science  27 Jun 2003:
Vol. 300, Issue 5628, pp. 2012
  1. AFFIRMATIVE ACTION

    Careful Use of Race Is OK, High Court Tells Colleges

    1. Constance Holden

    U.S. universities are hailing this week's rulings by the Supreme Court on the value of race-conscious admissions as a vote of confidence in their efforts to increase diversity. But although the court upheld the principle of taking race into consideration in college admissions, it rejected a numerical system at the University of Michigan that gives a substantial advantage to minority candidates. As a result, many universities may be forced to spend more time weighing the qualifications of each applicant.

    “We are elated. This is a huge victory for higher education,” says Michigan's president, Mary Sue Coleman. “I know all universities are breathing a great sigh of relief today.”

    Plaintiffs' groups are not happy. “Universities have been given a green light to continue the corrupt practice of favoring some applicants and disfavoring others because of their race,” says Bradford Wilson, director of the National Association of Scholars in Princeton, New Jersey. The case stirred unprecedented interest, with amicus curiae briefs filed by hundreds of educational organizations as well as large corporations and members of the military.

    The rulings (http://www.supremecourtus.gov/) come in two 1997 cases brought by white students challenging Michigan's admissions policies, one involving its undergraduate college and the other its law school. In its first pronouncement on the subject since an equally high-profile case that split the court in 1978, Regents of the University of California v. Bakke, the high court came down firmly in favor of “diversity” as the legal rationale for taking race into account in university admissions. Before that, proponents of affirmative action had often argued that it was intended to remedy past discrimination against African Americans and other minorities.

    The court voted 5-4 on Monday to uphold the law school's multifaceted approach to achieving a “critical mass.” Its vote to overturn Michigan's mechanistic system of adding 20 points to a candidate's application (100 is required for admission) based solely on race was 6-3. Although the justices split on Michigan's practices, says Alex Dreier, an attorney at the Washington, D.C., firm of Hogan & Hartson who helped write the amicus brief for the American Council on Education, the opinions are consistent with each other.

    The wait is over.

    Michigan's Mary Sue Coleman says universities “are breathing a great sigh of relief” after court rulings.

    CREDIT: ALEX WONG/GETTY IMAGES

    The law school opinion, authored by Justice Sandra Day O'Connor, said that the Equal Protection Clause of the U.S. Constitution permits the school's “narrowly tailored use of race in admissions decisions to further a compelling interest.” But the point system used for undergraduate admissions is “not narrowly tailored,” Chief Justice William Rehnquist wrote in the second decision, because it makes race a “decisive” factor in many minority admissions. That form of scorekeeping exceeds the use of race as a “plus” in the context of “individualized consideration” of applicants, the court explained.

    Coleman says it won't be a problem for Michigan to replace the point system with a more individualized assessment. “We may have to hire some more people, but we're happy to do that,” she says.

    Some schools have had considerable success doing exactly that. Rice University in Houston takes what mathematics professor Richard Tapia calls a “holistic approach” to every applicant. “We read the entire folder. Sure it takes time, but it's the only way to really get to know a student.” Tapia practices what he preaches: Since 1998, eight of the 23 Ph.D.s awarded in the mathematics department have gone to underrepresented minorities (Mexican Americans, African Americans, and Native Americans). Wilson says such an individualized approach, however, “presents a huge administrative challenge” to large and selective places such as Michigan “that use race preferences to overcome the credentials gap between whites and Asians, on the one hand, and blacks and Hispanics on the other.”

    A statement from 28 higher education groups that backed Michigan's policies praises the court for “[upholding] racial and ethnic diversity as a compelling state interest [and reaffirming] the importance of giving colleges and universities leeway in the admissions process.” Coleman says she believes that “this overturns Hopwood [a 1996 federal circuit court ruling that outlawed race-sensitive admissions in Texas, Mississippi, and Louisiana], so Texas can go back to using affirmative action.” Indeed, University of Texas President Larry Faulkner says he expects the school to factor in race in future admissions and financial aid decisions. Wilson sees a different future, however, forecasting “more litigation and, given the unpopularity of racial discrimination, perhaps more popular initiatives like Proposition 209 in California,” which outlawed affirmative action in state higher education.

    From his perspective at a “Hopwood” university, Tapia occupies the middle ground in interpreting the court's latest ruling. “Simple doesn't work when it comes to judging an applicant's ability to be successful,” he says. “That's what I've always believed, and now that's what the court seems to be saying.”

  2. ENVIRONMENTAL INDICATORS

    EPA Report Takes Heat for Climate Change Edits

    1. Erik Stokstad
    1. With reporting by Jocelyn Kaiser.

    The Draft Report on the Environment turned out to be a fitting swan song for Christine Todd Whitman, the departing head of the Environmental Protection Agency (EPA), who's been at odds with the White House but is credited with boosting science at the agency (Science, 30 May, p. 1351). Released on 23 June, the report sums up EPA's view of the most solid scientific measurements of environmental quality across the United States. But late last week, The New York Times reported that the White House had demanded changes to the scientific discussion of climate change in the document. Newspaper editorials and environmentalists railed about what they saw as EPA's loss of independence. Scientists familiar with the report say the episode is a shame, as it detracts from what appears to be a valuable baseline on the state of the U.S. environment.

    Similar to a study issued last year by the H. John Heinz III Center for Science, Economics and the Environment (Science, 27 September 2002, p. 2191), the draft report* identifies “indicators” that can be used to gauge nationwide trends in environmental quality. The Heinz report focused exclusively on the state of ecosystems. EPA expanded the scope to include human health and potentially harmful “pressures,” such as air pollution. It asked 18 scientists to review possible indicators, settling upon 181, such as ozone levels over North America and blood levels of mercury.

    Just the facts.

    Whitman says report is based on science, not policy.

    CREDIT: STEVEN SENNE/AP

    Like the Heinz study, the EPA report concludes that data for many indicators aren't available for the entire nation or cover only limited periods of time. “We can't make grand statements [about the state of the environment] based on data,” says Robin O'Malley, senior fellow and program director at the Heinz Center. EPA plans to collect data to fill in the gaps. O'Malley, like others, says that the data that EPA does include appear robust. “As far as I can see, they scrubbed the data pretty hard.”

    EPA scientists weren't the only ones scrubbing, though. According to an internal EPA white paper obtained by the National Wildlife Federation, the White House Council on Environmental Quality and the Office of Management and Budget made major edits to a section of the report dealing with climate change. They deleted a summary of scientific consensus and discussions of impacts on ecosystems and human health; the White House also emphasized uncertainty that EPA thought was scientifically unwarranted, according to the white paper. Ultimately, EPA decided to omit the section rather than “misrepresent” the science, the memo says. At a press conference, research chief and science adviser Paul Gilman defended the decision, saying that climate change was a small part of the report and will be handled by the White House's Climate Change Research Initiative (Science, 13 July 2001, p. 199).

    The White House involvement cast an unfortunate shadow, some observers say. “There is a danger that the great bulk of useful information in the report will somehow be tainted in people's eyes,” says Pat Kinney, an epidemiologist at Columbia University.

    EPA will collect comments on the draft. It's also forming a science advisory panel for its indicators initiative. Says Anthony Janetos of the Heinz Center: “I'm sure they'll get a lot of advice on climate change.”

  3. SMALLPOX IMMUNIZATION

    Panel Urges Caution Over Heart Problems

    1. Jennifer Couzin

    Concerned by cardiac problems among a small number of smallpox vaccine recipients, an advisory panel last week recommended that the Centers for Disease Control and Prevention (CDC) in Atlanta postpone the planned expansion of its vaccine effort. The recommendation of the Advisory Committee on Immunization Practices (ACIP) dealt another blow to the civilian program, announced by President George W. Bush on 13 December and already stalled because of a dearth of volunteers (Science, 9 May, p. 880). CDC says it will weigh the recommendation but for now intends to forge ahead.

    Previous experience with the so-called Dryvax vaccine, used routinely in the 1960s and early 1970s, had officials worried about several serious vaccine-related effects, including encephalitis. These have been rare. But among the 37,000 civilians and 450,000 military workers immunized with the stockpiled vaccine this time around, at least 21 and 37, respectively, have developed nonfatal inflammation of the heart; three other vaccine recipients suffered fatal heart attacks, which have not been definitively linked to the vaccine. The inflammation is most commonly myocarditis (inflammation of heart tissue), although cases of pericarditis (inflammation of tissue surrounding the heart) have also been reported.
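    For a rough sense of scale, the reported counts translate into crude rates along the lines of the sketch below (back-of-envelope only: the counts above are minimums, and case ascertainment differed between the civilian and military programs):

    ```python
    # Crude myo-/pericarditis rates implied by the counts reported in the story.
    # Back-of-envelope only: counts are minimums and surveillance differed by group.
    cases = {"civilian": (21, 37_000), "military": (37, 450_000)}

    for group, (n_cases, n_vaccinated) in cases.items():
        rate = n_cases / n_vaccinated * 100_000  # cases per 100,000 vaccinees
        print(f"{group}: {rate:.1f} cases per 100,000 vaccinees")
    # civilian: ~56.8 per 100,000; military: ~8.2 per 100,000
    ```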

    Vaccine woes.

    Smallpox immunizations have been linked to heart inflammation.

    CREDIT: CDC

    “These were totally unanticipated on the basis of past experience,” says John Modlin, a pediatric infectious-disease specialist at Dartmouth Medical School in Hanover, New Hampshire, and chair of ACIP.

    ACIP is particularly troubled about the timing of the cardiac inflammation cases. Most appeared 1 to 2 weeks after vaccination, when the live virus in the vaccine, called vaccinia, hits its peak replication. “The implication is that this is a vaccinia infection of the heart,” Modlin worries.

    Providing some of the first hard evidence to date, a paper in this week's Journal of the American Medical Association reports on 18 myopericarditis cases in the military, including biopsy findings from one of them. Heart tissue from that individual revealed infiltration of white blood cells called eosinophils—which are “not the typical virus-fighting cells,” says co-author Gregory Poland, head of the Mayo Clinic's vaccine research group in Rochester, Minnesota. He wonders whether it's not vaccinia but some other component of the vaccine that's prompting inflammation, perhaps bovine cells from the manufacturing process.

    Myopericarditis was reported in Europe following smallpox immunization several decades ago; those scattered cases were attributed to the more virulent strains of vaccinia used there. But until now cardiac inflammation has rarely been linked with U.S. use of the vaccine. One theory for its sudden appearance is that modern diagnostics make it easier to find. Adults might also be more susceptible to vaccine-related cardiac inflammation than children, the bulk of those vaccinated in the past, and may complain more readily about their symptoms. Because most who suffer cardiac inflammation recover on their own, cases in children could have flown under the radar.

    Acambis, a British-American company, is conducting clinical trials on a new smallpox vaccine produced by modern cell culture techniques. So far, the studies have been too small to document cardiac effects, if they exist. Barring any problems, the U.S. government plans to add doses of the Acambis vaccine to its existing stockpile.

    Given these uncertainties, ACIP doesn't discourage states from finishing the early phase of vaccination—inoculating the “first responders”—but urges against scaling up to include others, such as police and firefighters. CDC hasn't determined how it will proceed. “We're still moving forward,” says CDC spokesperson Von Roebuck.

    The question of whether to broaden the program, though, may be moot. The flow of volunteers has slowed to roughly 100 people a week, say CDC officials, making it unlikely that the agency will reach its original goal of up to 10 million vaccinees anytime soon.

  4. BIOTECHNOLOGY

    Sheep Fail to Produce Golden Fleece

    1. Gretchen Vogel

    The company that helped clone Dolly the sheep is fighting for survival now that a key project has been put on ice. PPL Therapeutics announced on 18 June that a partner, the German drug giant Bayer, has suspended development of a treatment for lung ailments called recombinant alpha-1 antitrypsin (AAT), one of PPL's two main potential products. In response, PPL says it will lay off up to 140 of its 165 employees and will scrap plans for a $70 million manufacturing plant.

    The drastic downsizing leaves PPL with an uncertain future. Last week the company, based in Midlothian, Scotland, told shareholders that it is mulling three options: press ahead with bringing to market its other main product, a blood-clotting agent called Fibrin 1; find a buyer; or liquidate its assets and return the cash to shareholders. PPL has hired the consulting firm KPMG to advise it on the best way forward. So far, however, the reaction from Wall Street has been less than supportive: PPL's stock price fell to $10.20 per share after the announcement, valuing the company at $12 million. That's $3 million less than it has in the bank.

    Founded in 1987 to commercialize research at the government-funded Roslin Institute, PPL became world famous in 1997 when its scientists were part of the team that created Dolly, the first mammal cloned from a mature cell. PPL researchers were also the first to clone pigs, and they have been at the vanguard of technology for creating transgenic sheep, pigs, and cattle.

    Uncertain future.

    The fate of PPL's 4000-strong New Zealand flock, engineered to make the protein AAT, is undecided.

    CREDIT: PPL THERAPEUTICS

    All of that scientific prowess failed to translate into business success. The company had hoped to turn a profit by harvesting hard-to-manufacture proteins such as AAT from the milk of transgenic animals. But the stock market downturn hit the company at a vulnerable time, says biotech analyst Julie Simmonds of London investment banking company Evolution Beeson Gregory. “They didn't go to commercial production quite fast enough,” she says, and a funding drought limited the company's ability to capitalize on its lab successes.

    Although PPL claims that early clinical trials of AAT were going well, spokesperson Philip Dennis acknowledges that lingering questions about the purity of AAT produced in sheep milk and a few minor but unexplained side effects—such as mild coughing—suggested to Bayer that it would be too expensive to gamble on the drug. It is unclear what will happen to PPL's 4000-strong flock of AAT-producing transgenic sheep in New Zealand.

    The Bayer blow is the latest in a series of setbacks for PPL. In November, Alan Colman, a leader of the team that cloned Dolly, left to join ES Cell International, a company in Singapore that specializes in human embryonic stem cells. And last April, PPL spun off its U.S. branch—now named Regenecor—which focuses on regenerative medicine, including the development of transgenic animals to serve as human tissue donors.

    PPL executives believe they have enough money in the bank to bring Fibrin 1 to market by 2006, Dennis says. Company leaders say it will be easier to win regulatory approval for the product because it can be classified as a medical device rather than a drug. But the calculations will leave PPL running on vapors toward that checkered flag. “It's a very risky option,” says Simmonds. “The amount of cash they have is just enough to make it. If anything unexpected happens, I don't think the market will be in the mood to bail them out.” PPL is expected to have the KPMG report in hand by late summer, after which it will announce whether it will be Fibrin or bust.

  5. COMPUTER SCIENCE

    Scientists Launch Global Internet Research Lab

    1. Robert F. Service

    Planting the seeds for tomorrow's Internet, a group of researchers announced this week that it has launched a new virtual Internet research laboratory. Called PlanetLab, the collaboration consists of researchers from 60 universities in 16 countries as well as from computer giants Intel and Hewlett-Packard. The new partnership aims to build a proving ground for applications that could enable the Internet to monitor itself for viruses and worms, recall Web pages long after they've disappeared, and gain other powerful new capabilities.

    Research networks have come and gone before. But earlier efforts either were targeted to single applications or were wide-ranging but small, involving no more than a few dozen computers. PlanetLab, by contrast, aims to test a broad spectrum of applications with a network of 1000 machines. “That scale is interesting,” says Robert Kahn, president and CEO of the Corporation for National Research Initiatives in Reston, Virginia, who co-wrote some of the original computer protocols that led to the development of the Internet. “No one has tried something at that scale before.”

    Today's Internet is little more than a common language, or protocol, that allows computer users to quickly send each other information. When someone clicks on a Web page, computer software in the sender's machine sends out a stream of data, which is broken into a series of data packets. These are transmitted via a network of machines called “routers” to the recipient's computer, where software reassembles them into a Web page. The process makes the Internet available to anyone with a computer, a modem, a phone line, and a standard software package. That simplicity proved so successful that the number of Internet users surged from about 16 million in December 1995 to more than 600 million in September 2002.
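    In outline, those mechanics are simple enough to sketch in a few lines of code (a toy illustration only, not a real protocol implementation; the packet size and numbering scheme here are invented):

    ```python
    # Toy sketch of packetizing a data stream and reassembling it at the far end.
    # Real Internet protocols add headers, checksums, routing, and retransmission.
    PACKET_SIZE = 8  # bytes of payload per packet (arbitrary for this example)

    def packetize(data: bytes) -> list[tuple[int, bytes]]:
        """Break a byte stream into numbered packets."""
        return [(i, data[i:i + PACKET_SIZE]) for i in range(0, len(data), PACKET_SIZE)]

    def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
        """Rebuild the stream even if packets arrive out of order."""
        return b"".join(chunk for _, chunk in sorted(packets))

    message = b"Hello, PlanetLab! A tiny web page."
    packets = packetize(message)
    packets.reverse()  # simulate out-of-order arrival over different routes
    assert reassemble(packets) == message
    ```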

    Web's sites.

    PlanetLab's Internet research network will start with 170 computers based at 65 sites. The network is expected to grow to 1000 computers in 2 years.

    SOURCE: PLANETLAB

    But according to Larry Peterson, an Internet researcher at Princeton University and head of PlanetLab, the Internet's vastness now stymies most efforts to innovate on the network itself. Researchers looking to create novel applications have no way to test them in an environment that is both easily monitored and big enough to give real-world results. “There is a very high barrier to entry,” Peterson says. “How do you get [new applications] across to 1000 machines?” Adds another PlanetLab founder, David Culler of the University of California, Berkeley, “This is a concern that has been growing throughout our community.”

    Last year Culler, Peterson, and others started a grassroots effort to create a new research testbed. Their idea was to link computers into an “overlay” network using current Internet connections, much as the Internet itself started as an overlay on top of the telephone network. Each computer in the new network will contain software enabling it to share storage and processing power with other computers in the network. Participating research teams will then receive a “slice” of the network's time and resources to run their application. All the linked computers will be used both to run the applications and to monitor the network itself. “We can see where congestion is and poke and probe the network to discover where there are failures,” says Rick McGeer, a computer scientist at Hewlett-Packard Research Labs in Berkeley.
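    The “slice” bookkeeping can be pictured roughly as follows (a minimal sketch with invented node names and numbers; PlanetLab's actual slice-management software is far more elaborate):

    ```python
    # Minimal sketch of overlay-network "slices": each node reserves a bounded
    # share of its resources for each research team. Names/numbers are invented.
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        name: str
        free_cpu: float = 1.0                      # fraction of CPU unallocated
        slices: dict[str, float] = field(default_factory=dict)

        def grant_slice(self, team: str, share: float) -> None:
            if share > self.free_cpu:
                raise ValueError(f"{self.name}: no capacity left for {team}")
            self.free_cpu -= share
            self.slices[team] = share

    # Three overlay nodes riding on ordinary Internet connections.
    overlay = [Node("intel-berkeley"), Node("princeton-1"), Node("hp-labs")]

    # One research team gets a 10% slice of every node for its experiment.
    for node in overlay:
        node.grant_slice("worm-monitor-experiment", 0.10)

    print({node.name: node.slices for node in overlay})
    ```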

    The idea took hold quickly. Last summer, Intel officials agreed to donate an initial 100 computers and to house many at Intel sites around the globe. Culler and Peterson then put the word out to their colleagues and began signing them up. And this April Hewlett-Packard officials came on board, donating another 30 machines. PlanetLab “has generated an enormous amount of enthusiasm from our community,” McGeer says.

    One unknown is whether government funding agencies will join in. Peterson says he and others have submitted a pair of grant proposals to the U.S. National Science Foundation for $18 million to cover costs of both the network and research for 4 years. The decisions are expected any day.

  6. ATMOSPHERIC SCIENCES

    NSF Steps Over Political Hurdles to Build Mobile Radar Facility

    1. Jeffrey Mervis

    Space weather researchers were crestfallen in 1998 when a powerful legislator blocked a long-planned upper atmospheric radar facility in Canada's Arctic region. Senator Ted Stevens (R-AK), chair of the Senate Appropriations Committee, wanted the facility—called the Polar Cap Observatory (PCO)—built on U.S. soil, preferably in his home state of Alaska (Science, 8 May 1998, p. 820). But the National Science Foundation (NSF), which had requested $25 million for the project, insisted the best place to put it was a desolate Canadian site only 700 kilometers from the magnetic north pole, an area of intense interaction between the solar wind and Earth's ionosphere.

    That impasse killed PCO. But the scientists, some of whom had been working on the idea for more than a decade, didn't give up. And late last month their persistence was rewarded: The National Science Board (NSB), NSF's oversight body, gave the green light to a reworked proposal that satisfies both scientific and political imperatives, and there's no sign of congressional resistance. “We expect it to be a cornerstone of observational research in the upper atmospheric sciences,” says Jarvis Moyers, head of the NSF division that will foot the bill. It's also the first of what is expected to be dozens of mid-sized research facilities that NSF plans to build in the coming decade to satisfy what a recent NSB report calls a pressing national need for such tools.

    Best face forward.

    A prototype showing two of the 128 panels that make up each face of the modular radar facility.

    CREDIT: SRI INTERNATIONAL

    The new and improved instrument, called the Advanced Modular Incoherent Scatter Radar (AMISR), will feature a lightweight, electronically steerable bank of transmitters and antennas. It can be shipped anywhere in the world and operated remotely. The PCO, in contrast, was to be a massive, mechanically steerable single-faced array built and housed at weather-challenged Resolute Bay. AMISR will consist of three faces, each comprising 128 panels and measuring 32 meters square, that will be deployable in unison or separately. It will be three times more sensitive than PCO, and its pulsed signals can be turned on and off instantly from a remote site.

    AMISR will cost more—$44 million compared with $25 million for PCO. But its portability, combined with the technological improvements since PCO was conceived, will make a huge difference in what AMISR can accomplish. “We will be able to do more science than we could have done with a fixed system and more than we ever imagined,” says Cornell University electrical engineer Michael Kelley, who edited a 1990 report that called PCO “The next step in upper atmospheric science.” Adds John Kelly of SRI International in Menlo Park, California, which is leading a consortium that will build the instruments, “The last time a radar station was moved [from Alaska to Greenland], it took a year to disassemble and required a blowtorch.”

    PCO was designed to monitor a swath of atmosphere 90 to 1000 km above the magnetic north pole, where Earth's magnetic field funnels the energy of the solar wind. The energy can arrive in sudden bursts as well as in a steady flow. When it slams into Earth's upper atmosphere, it creates violent disturbances that may alter weather patterns, disrupt electronic communications, knock out electrical power grids, and wreak havoc with other terrestrial activities. Incoherent scatter radar sends intense beams that can measure the composition, temperature, density, and motion of the ionosphere and upper atmosphere. AMISR can be configured to take such observations at any latitude.

    The first face will be built and deployed over the next 18 months in central Alaska, some 2500 km from the magnetic pole, at a scientific sounding rocket launch site operated by the University of Alaska, Fairbanks, that is already heavily instrumented for upper atmospheric studies. The second and third faces will be assembled and tested at Resolute Bay, followed by a 3-year observation program.

    A scientific panel will determine where to place the faces based on the merit of competing proposals, and the competition is expected to be hot. “There could be some passionate arguments,” says Kelley. “You could be following a line along the magnetic equator and think, ‘If only I knew what was happening at the end of that [magnetic] field line,’” he says. “Now we'll be able to stick a face there and get the answer.”

    NSF has already funded some $7 million in development work. But the project has kept a low profile since SRI submitted its revised proposal in April 2001. Congressional aides, for example, first learned about AMISR only 2 weeks before the science board gave its final approval. “We kept it fuzzy on purpose, with only one line in the '03 budget, because we were still doing testing,” says Moyers.

    NSF officials announced the good news about AMISR at last week's annual meeting in Colorado of an ongoing NSF-funded program called Coupling, Energetics, and Dynamics of Atmospheric Regions. “There was a lot of celebrating,” says Kelley, whose original report was vetted at the group's 1990 meeting.

    AMISR grew out of a similar brainstorming session after Stevens put the kibosh on PCO. “We were pretty bummed out,” Kelley admits. But the legislative defeat turned out to have a silver lining. “I would never want to say that the senator did us a favor,” he says. “But this instrument can do so much more.”

  7. NANOTECHNOLOGY

    Sorting Technique May Boost Nanotube Research

    1. Robert F. Service

    Ever since a Japanese researcher discovered in 1991 that atom-thick sheets of carbon in graphite could roll up into nanosized straws, carbon nanotubes have awed and frustrated researchers. Nanotubes' ability to conduct electricity either like metals or like semiconductors, depending on their precise atomic arrangement, has made them the darlings of molecular electronics and led to their incorporation into a wide range of nanosized building blocks of computer circuitry. But the method used to produce nanotubes—zapping graphite with either a laser or an electric jolt—creates a jumble of metallic and semiconducting tubes, and so far there has been no way to separate them. Now a new scheme may greatly reduce nanotube researchers' headaches.

    In a paper published online by Science this week (www.sciencemag.org/cgi/content/abstract/1086534), physicists Ralph Krupke and Hilbert von Löhneysen and chemists Frank Hennrich and Manfred Kappes of the University of Karlsruhe in Germany report a simple electrical technique to divide semiconducting and metallic nanotubes. According to chemist Richard Smalley of Rice University in Houston, the technique is long overdue and will find immediate use. “A lot of us believe that the time has come for us to get better control of what nanotubes we're playing with,” Smalley says. “I'm impressed they have done it.” But for now both Smalley and Kappes note that the technique works only with extremely small amounts of nanotubes and needs to be scaled up for most industrial uses.

    Kappes and colleagues discovered the sorting method while looking for new ways to wire up tubes between pairs of electrodes. Like most molecular electronics researchers, they were working with single-walled nanotubes (SWNTs) made up of a single rolled-up graphite layer. SWNTs have more consistent electronic properties than the more complex multilayer tubes, but they are tricky to work with. Because their long sidewalls strongly attract one another, they typically emerge from production chambers as bundles of 50 or more tubes that resemble handfuls of drinking straws.

    Close call.

    A new technique separates metallic nanotubes (top) from their semiconducting counterparts (bottom) by using an electric field to draw the more conductive metallic tubes to an electrode.

    CREDITS: V. CRESPI/PENN STATE UNIVERSITY

    Lacking a simple way to separate bundles of tubes into individual ones, Kappes and colleagues set out to use entire bundles of tubes as wires between pairs of electrodes. But last year Smalley's group reported a new scheme to bombard nanotubes in a solvent with ultrasound to create a stable suspension of individual SWNTs (Science, 26 July 2002, p. 593). The method still produced a mix of semiconducting and metallic tubes, but it gave the Karlsruhe team an idea for a new separation technique.

    The challenge in separating SWNTs is that semiconducting and metallic versions are nearly identical. Both are made up only of carbon atoms and thus lack unique chemical handles to grab on to. So the Karlsruhe team opted to use their different electronic properties to sort them. They knew that when placed in an electric field both metals and semiconductors can be induced to form “dipoles,” with more positive charges at one end and more negative charges at the other. A nanotube's negatively charged electrons are attracted to the end of a tube closest to the positive electrode, whereas positive charges on the tube tend to pile up closest to the negative electrode.

    In a direct-current electric field, both semiconducting and metallic SWNTs can be induced to have dipoles. But differences emerge in the presence of an alternating current. In this case, the charge in pairs of electrodes is switched back and forth millions of times a second. With each switch, electrons whisk back and forth through the tubes. The key is that electrons whip through metallic tubes but merely plod through semiconducting ones. As a result, the metallic nanotubes become polarized much faster than their semiconducting brethren do and thus wind up with much stronger induced dipoles. Because the polarity of both the electrodes and the nanotubes themselves flips with each switch, the metallic tubes get pulled more quickly to the nearest electrode. By taking advantage of this difference, the researchers were able to separate mixtures of tubes in just minutes.
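    The mechanism the Karlsruhe team exploits is known as dielectrophoresis: in a nonuniform alternating field, the time-averaged force on a polarizable rod scales with the real part of its Clausius-Mossotti factor. The sketch below, which uses illustrative permittivity and conductivity values rather than the group's actual numbers, shows why the two tube types respond so differently:

    ```python
    # Dielectrophoresis sketch with illustrative values (not the Karlsruhe data).
    # For a long rod aligned with the field, the time-averaged force scales with
    # Re[K], where K = (eps_p* - eps_m*) / eps_m* and eps* = eps - j*sigma/omega.
    import math

    OMEGA = 2 * math.pi * 10e6   # assume a ~10 MHz drive (order of magnitude)
    EPS0 = 8.854e-12             # vacuum permittivity, F/m

    def re_clausius_mossotti(eps_p, sigma_p, eps_m, sigma_m):
        """Real part of the rod-geometry Clausius-Mossotti factor."""
        ep = complex(eps_p * EPS0, -sigma_p / OMEGA)   # complex tube permittivity
        em = complex(eps_m * EPS0, -sigma_m / OMEGA)   # complex solvent permittivity
        return ((ep - em) / em).real

    eps_solvent, sigma_solvent = 40, 1e-6  # assumed polar organic solvent
    print("metallic tube:      ", re_clausius_mossotti(1e4, 1e6, eps_solvent, sigma_solvent))
    print("semiconducting tube:", re_clausius_mossotti(5, 1e-9, eps_solvent, sigma_solvent))
    ```

    With values like these, Re[K] comes out large and positive for the metallic tubes but slightly negative for the semiconducting ones, so only the metallic tubes are drawn toward the strong field at the electrode while their semiconducting counterparts stay in suspension.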

    So far, Kappes says, the new technique separates out just picograms of nanotubes at a time. That is enough to be useful for molecular electronics researchers, but it won't create new industries dependent on just one tube type. Kappes says he and his colleagues are already working to scale up the process. If they succeed, nanotechnologists will likely be in for plenty more awe.

  8. APPROPRIATIONS

    House Bill Signals the End of NIH's Double-Digit Growth

    1. Jocelyn Kaiser

    It may be crunch time for the National Institutes of Health (NIH), judging by Congress's first action on the 2004 biomedical budget. NIH has seen a series of double-digit increases over the past 5 years, but a House of Representatives appropriations subcommittee last week marked up a bill that would give NIH a rise of only 2.5%—to $27.7 billion—signaling an abrupt halt to the expansion. Biomedical research advocates are hoping that the Senate, which begins work on its own bill this week, will scrape up more money.

    Since 1999, NIH has enjoyed increases of roughly 15% each year, part of a campaign to double its budget. But the House often started with a low number; this year it has recommended a raise of only $681 million, essentially mirroring President George W. Bush's budget request. In percentage terms, the House panel's proposal is the lowest in many years.

    Both the House subcommittee and the White House argue that the impact on science might not be harsh, however, because spending on research would actually go up about 7%. To make this happen, the House panel would curtail $1.04 billion in current spending on construction and other one-time expenses, freeing up money for research grants. Still, the growth would be narrowly channeled: Much of the increase is slated for bioterrorism research, allowing few new grants beyond this year's number in other areas, notes Steven Teitelbaum, president of the Federation of American Societies for Experimental Biology. This looks like “less than one new nonbiodefense grant per institute,” says Teitelbaum. “It's very much a crash landing.”

    NIH supporters have said that the agency needs an increase of at least 8.5% to maintain gains from the doubling. If growth were cut to 2.5%, “it would be a disaster. … It would leave us with all momentum lost and no opportunity to fund new grants,” says former Illinois Representative John Porter (R), now at the law firm Hogan & Hartson in Washington, D.C. Porter spearheaded the doubling campaign when he chaired the House Appropriations Subcommittee on Labor, Health and Human Services, and Education in the late 1990s. He now worries that a retrenchment could send “a terrible message to young researchers.” Some House Democrats also decried the meager increase.

    The numbers could change as the bill advances, although the full appropriations committee and House generally follow the subcommittee's lead. The subcommittee had little room to maneuver because it was given a small total budget allocation. NIH supporters are hoping to do better when the Senate Labor-HHS-Education appropriations subcommittee marks up its bill, scheduled for this week. In past years, the Senate has given NIH a larger increase than the House, and the increase has often made it into law.

  9. PUBLIC HEALTH

    French Supreme Court Ends Tainted Blood Saga

    1. Barbara Casassus*
    1. Barbara Casassus is a writer in Paris.

    PARIS—A legal odyssey involving the contamination of France's blood supply with the AIDS virus in the mid-1980s appears to have ground to a conclusion. Last week, the Supreme Court (Cour de Cassation) upheld a ruling that 30 health officials, doctors, and political aides cannot be held responsible for delays in blood screening that resulted in thousands of people becoming infected with HIV.

    The decision is a huge relief for the defendants, including a prominent AIDS researcher, Jean-Baptiste Brunet—who colleagues felt had been charged unfairly with “complicity in poisoning”—and François Gros, a well-known cell biologist and former secretary of the Academy of Sciences who was the government's chief medical adviser when the scandal broke. More broadly, the decision could have major consequences for health litigation in France.

    Most of the defendants had been involved in government decisions in early 1985 that, prosecutors alleged, were designed to keep an HIV-antibody test manufactured by the U.S. firm Abbott off the market while France's Diagnostics Pasteur readied its own test. (Brunet had been accused of sitting on data showing that a small percentage of blood taken from Parisian blood donors was contaminated with HIV, a charge he vehemently denied.) In the months before France began screening blood systematically for the AIDS virus in October 1985, more than 4000 people are thought to have become infected—including about 2400 transfusion recipients and 1300 hemophiliacs—all of whom have received compensation from the state, says Jean-Philippe Duhamel, a Supreme Court attorney for the Hemophiliacs Association. The Supreme Court concluded that the defendants were not guilty of complicity to poison because there was no proof that they intended to harm people or that they even knew that the blood stocks were contaminated. The court added that delays over the introduction of the Abbott test did not prevent it from being used in some transfusion centers; it also noted that there were doubts about Abbott's capacity to supply the French market and about the test's reliability.

    The ruling could affect another case in which several scientists at the Pasteur Institute and elsewhere are under criminal investigation for their roles in preparing human growth hormone apparently containing aberrant prion proteins implicated in the fatal brain-wasting condition, Creutzfeldt-Jakob disease. The tainted hormone stocks, derived from cadavers, were prescribed to children in 1984 and 1985, before France switched to safer recombinant growth hormone. But the Supreme Court decision suggests that case is tenuous, argues Duhamel: In the HIV case, “there is no doubt that the blood stocks were contaminated, whereas it is only probable in the growth hormone case.”

    Some observers complain that in spite of the intense publicity the HIV case has generated, France is still ill-prepared for the next hidden bloodborne infection. “The authorities have drawn no lessons from the case,” asserts Luc Montagnier, leader of the Pasteur team that first isolated HIV in the early 1980s. “The authorities never react until there is a catastrophe, so we could well have another blood scandal in the years ahead if no further action is taken.” Montagnier, who now heads the Foundation for AIDS Research and Prevention, has called for much more aggressive screening of blood for a range of infectious agents.

  10. NEUROSCIENCE

    The Puzzling Portrait of a Pore

    1. Greg Miller

    A 6-year effort to solve one of neurobiology's toughest mysteries has yielded a provocative solution and sparked a debate among ion channel researchers

    Last month a team of neuroscientists announced a long-awaited breakthrough—an exquisitely detailed portrait of a type of protein that is crucial for the generation of nerve impulses, the blips of electrical activity that underlie all movement, sensation, and thought. The team, led by Roderick MacKinnon of Rockefeller University in New York City, drew on the new image, produced by x-ray crystallography, and a number of physiological studies to propose a simple and elegant model for the workings of the protein, a voltage-gated ion channel. These molecular gatekeepers determine when ions are allowed to pass across a cell membrane.

    Unfortunately, however, the model contradicts the widely accepted view of how the channel works, which is based on decades of solid research. And although everybody agrees that deriving the new structure was a tremendous technical feat, many of MacKinnon's colleagues say the final product must be flawed. “There are an enormous number of reasons to think there's something wrong with the structure,” says Richard Horn of Thomas Jefferson University in Philadelphia.

    MacKinnon says he, too, was uneasy about the structure at first. But he became convinced by a series of physiological studies that support the new model. He suggests that those who dismiss the model out of hand are too attached to an old paradigm to accept the new evidence. “The eye only sees what the mind already knows,” he says. The controversy has roiled this normally calm corner of neuroscience and has many labs working overtime on experiments that put the surprising model to the test.

    Tiny tunnels

    More than half a century ago, physiologists Alan Hodgkin and Andrew Huxley, in a set of experiments that won them the Nobel Prize, discovered that nerve impulses are generated by the flux of ions across the lipid membrane of a neuron. As with many great insights, this one generated plenty of questions. Hodgkin and Huxley showed that in order for neurons to work, the right ions have to cross the membrane at the right time. But how? Charged molecules cross lipid membranes about as easily as people sprint up Mount Everest: The amount of energy required makes it virtually impossible. Tiny ion channels, researchers later learned, do the work of a thousand sherpas, allowing ions to skip through the membrane with ease.

    For ion channels in electrically active nerve and muscle cells, opening at the right time means responding to changes in the voltage across the cell membrane. This process, called voltage gating, has been the focus of intense research, and in recent years, the field seemed to be coming to something like a consensus—researchers still quibbled over details, but a rough picture of the general mechanism seemed to be emerging.

    Many of the key experiments were done with voltage-gated potassium channels (so named because only potassium ions slip through when they open). These channels are responsible for bringing a nerve impulse to an end so a neuron can prepare to fire again. Without them, the electrical pulses that are the basic units of neural communication would be reduced to a meaningless buzz.

    Competing theories.

    The standard model (top) of voltage-gated potassium channels holds that pistons open (left) or close (right) the passage. In a new model (bottom), a paddle opens the channel (left) by sweeping toward the external surface of the cell.

    ILLUSTRATION: C. SLAYDEN

    Voltage-gated potassium channels are highly conserved from the simplest organisms to the most complex. Most have four identical subunits, each containing six loosely linked helical segments. In what MacKinnon now calls the “conventional” view (many of his colleagues bristle at this label, which they feel is a dismissive characterization of the leading hypothesis), all six helices are oriented more or less perpendicularly to the lipid membrane. The fifth and sixth segments (S5 and S6) of each of the four subunits face the center and together form the pore through which ions pass.

    The fourth segment (S4) is widely thought to be the voltage sensor. This segment contains a number of positively charged amino acids. The conventional view of how S4 contributes to voltage gating is somewhat fuzzy, but the general idea goes like this: The interior of a resting neuron is electrically negative relative to the fluid surrounding it. In that state, the positively charged S4 segment, mostly surrounded by watery crevices, is drawn toward the inside of the cell and the channel is closed. During a nerve impulse, the inside of the neuron rapidly becomes more positive, exerting less pull on and even repelling S4's positive charges, allowing the segment to shift toward the outside of the cell. This movement somehow pulls on the other segments of the channel to open the pore and allow positively charged potassium ions to rush out of the neuron, restoring the interior to its negative resting condition.

    “There is a large body of data that comes from a wide variety of approaches—biochemical, electrophysiological, different kinds of chemical modification approaches—that all lead to [this] picture of the structure of the channel,” says Diane Papazian, an ion channel researcher at the University of California, Los Angeles (UCLA). The research, done in about a dozen labs, “led to a very coherent picture of how [voltage-gating] worked.”

    Sweeping change

    That picture has been turned inside out—literally—by two papers published in Nature on 1 May. They provide the most direct look yet at a voltage-gated potassium channel. MacKinnon's lab distinguished itself in 1998 by producing the first-ever crystal structure of a membrane-bound ion channel (Science, 3 April 1998, pp. 69 and 106), work that won MacKinnon kudos from colleagues and a share of the prestigious Lasker Prize the following year. The lab has subsequently produced three more ion channel structures, including the new one.

    The latest effort took nearly 6 years. The difficulty stemmed from the fact that the voltage sensing region of the channel protein is a moving part, says Youxing Jiang, the study's first author, now at the University of Texas Southwestern Medical Center in Dallas. This flexibility makes the protein exceedingly tough to coax into the neat, compact, latticelike crystals required for crystallography. The team tested five or six different voltage-gated potassium channels in a variety of conditions that encourage crystallization, all to no avail, Jiang says.

    A breakthrough came when the group turned to a channel found in Aeropyrum pernix, an archaebacterium that dwells in 95°C water inside hydrothermal vents. If the organism can withstand that kind of heat, the team reasoned, its voltage-gated potassium channel, called KvAP, must have some built-in stability.

    To stabilize the channel further, the researchers attached tiny bits of antibody molecules. These bits latched onto specific places on the KvAP protein and formed a scaffolding that held it in place. Even using this approach, the scientists struggled to get solidly packed crystals, so they decided to try to solve the structure of just the domain of the channel—segments S1 to S4—that includes the voltage-sensing region. This structure initially seemed to fit with the conventional model: It suggested that the S4 region would fit neatly in the core of the channel, MacKinnon says. “We were ready to publish it.” That was in early 2001.

    Ready for my close-up.

    Antibodies (green) hold the channel in place; it's seen here from the inside of a cell looking out.

    CREDIT: Y. JIANG ET AL., NATURE 423, 33 (2003)

    But a series of physiological experiments expected to confirm S4's position didn't turn out as predicted. “We couldn't convince ourselves that it fits the conventional model, even though we came into it with that bias,” MacKinnon says. So they held off on publishing the work and redoubled their efforts to get the entire channel.

    When success came at long last and the team translated the crystallography data into a molecular model, they were shocked, MacKinnon says. “Our first response was: ‘What is this? This is crazy!’” They weren't the only ones. “I nearly fell out of my chair,” says Kenton Swartz of the National Institute of Neurological Disorders and Stroke in Bethesda, Maryland. “The KvAP structure is pretty bizarre.”

    Its most conspicuous feature is the position of the S4 segments. Rather than lining up inside the channel parallel to the other segments, they are arranged like spokes around the periphery of the channel, in the plane of the membrane.

    Instead of traveling with modest pistonlike or screwlike movements through the core of the channel as most researchers had envisioned, the MacKinnon group proposes that the S3 and S4 segments form a paddle that moves like a lever arm through the lipid membrane surrounding the channel, sweeping from the intracellular side to the extracellular side of the membrane. This movement somehow—as with the conventional model, the details aren't known—pulls on the other segments of the channel to open the pore. The work “certainly creates a paradigm shift in terms of how we interpret gating,” says Robert Blaustein of Tufts-New England Medical Center in Boston.

    But many researchers aren't ready to embrace that shift just yet. Putting hydrophilic molecules such as the charged residues on the S4 segment in a hydrophobic environment like a lipid membrane is “energetically unrealistic,” says Robert Guy, a biophysicist at the National Cancer Institute in Bethesda. Lipid membranes are a “forbidden zone” for charged molecules, says Fred Sigworth, a physiologist at Yale University. “Everyone thought [the S4 segments] were embedded in the core of the protein where there are countercharges to keep them electrostatically happy.”

    And the charged residues don't just sit there: According to MacKinnon's calculations, they move a whopping 20 angstroms through the membrane when the channel opens. That's substantially more movement than any version of the conventional model predicts, and some researchers think that it would require an unrealistic amount of energy.
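    A back-of-envelope estimate, with illustrative numbers not taken from the paper, shows why the scale matters: the electrical work the membrane field can do on a charge is modest compared with the cost of burying a bare charge in lipid.

    ```latex
    % Work done by the transmembrane field on one elementary charge crossing
    % the full ~100 mV potential (illustrative numbers only):
    \[
    W = q\,\Delta V = (1.6\times 10^{-19}\,\mathrm{C})(0.1\,\mathrm{V})
      = 1.6\times 10^{-20}\,\mathrm{J} \approx 0.1\,\mathrm{eV} \approx 4\,k_B T .
    \]
    % By contrast, the Born self-energy of placing a bare charge in low-dielectric
    % lipid ($\varepsilon \approx 2$) rather than water ($\varepsilon \approx 80$)
    % runs to tens of $k_B T$, which is why critics insist the S4 charges need
    % countercharges or watery crevices along the way.
    ```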

    MacKinnon counters that not enough is known about the energetics to say whether the movement is realistic or not. “It's an interesting energetic problem,” he says, but he adds that he can envision a number of scenarios that would make the movement of charges less difficult than it might appear at first glance.

    Another troublesome feature of the model concerns a region of the protein known as a glycosylation site. Glycosylation sites are places where sugar molecules are added to newly made proteins to mark them for transport to the membrane. They always end up on the extracellular side of the membrane, but in MacKinnon's model of KvAP, a glycosylation site between segments S1 and S2 falls squarely in the middle of the membrane.

    Many researchers also point out that the model contradicts several “accessibility” studies, including work done in MacKinnon's own lab. These studies test which parts of the channel can be grabbed by other molecules from the inner or outer surface of the membrane as the channel opens and closes. For example, as a postdoc in MacKinnon's lab, Swartz did a series of accessibility studies using a tarantula toxin called hanatoxin and a voltage-gated potassium channel from rats. He found that hanatoxin binds to the S3 segment when applied to the extracellular but not intracellular surface, regardless of whether the channel is open or closed. That suggested that this region of the channel is always near the outside of the membrane—a result that appears to contradict MacKinnon's vision of an S3-S4 paddle that sweeps from one surface of the membrane to the other.

    And the list goes on. “There are all kinds of things about the paddle that bother me,” says Horn. He and others cite additional accessibility studies, genetic mutation experiments, and work done with fluorescent markers to track the movements of particular regions of the channel as examples of work that doesn't jibe with the paddle model. Francisco Bezanilla, a colleague of Diane Papazian's, says the two of them, UCLA colleague Ana Correa, and their respective labs sat down one day and came up with a list of 19 experimental results they couldn't explain if MacKinnon's model is correct.

    Many researchers think that the discrepancies between MacKinnon's model and previous work arise from the extreme measures the team had to take to crystallize the channel protein. “The conditions apparently ripped [the channel] apart in some way,” says Horn. He and others suspect that the antibody fragments are to blame. “The antibodies made the floppy structure go into a state that it doesn't normally go into in the membrane, like opening a flower,” suggests Bezanilla.

    MacKinnon acknowledges that the structure presented in the paper almost certainly isn't what the channel looks like in a living membrane. “S1 and S2 are clearly pulled in a funny position,” he says, which could explain the odd location of the glycosylation site and perhaps other discrepancies. And the S3-S4 paddle probably isn't as close to the inner surface of the membrane, even in the closed position, as it appears in the model, he says.

    But he firmly defends the key feature of the model—the paddle that sweeps through the membrane in response to voltage changes. The key evidence, he says, is a series of accessibility studies with the KvAP channel indicating that S3-S4 is accessible from only the inside of the membrane when the channel is closed and only from the outside when it's open.

    At first glance, these experiments fly in the face of the earlier hanatoxin experiments, which suggested that the S3-S4 linker is always accessible from the outer surface of the membrane. MacKinnon thinks that the new experiments are more trustworthy because they involve attaching a bulky molecule to the channel. Hanatoxin is smaller, and its size or some other feature might enable it to sneak into the membrane and bind the voltage sensor when the channel is closed, MacKinnon speculates, although he admits that the two sets of experiments are hard to reconcile. Even so, he says, “I don't see this as a fundamental inconsistency, I see this as something we don't understand.”

    Many researchers are now working on experiments—from refined versions of the earlier accessibility studies to computer modeling studies—designed to put the new model to the test. The first results have begun to roll in, and several researchers say they plan to submit manuscripts over the summer.

    MacKinnon's lab is hard at work on two more crystal structures of the KvAP protein. Both will use antibody fragments that attach to different regions of the channel than do those used in the first study. Comparing all three structures should allow the team to determine which features of the channel are real and which have been distorted by the antibody scaffold.

    Regardless of how things turn out, even those who doubt MacKinnon's model agree that the new structure is a substantial contribution to the field. “I think the crystal structure may be the biggest break in understanding voltage gating since Hodgkin and Huxley,” says Guy, who is among the fiercest critics of the new model. And the study has certainly energized the field. “I can't think of another time when there have been so many experiments just screaming to be done,” says Swartz. Settling the issue will take some time. Meanwhile, if you're looking for some excitement, consider dropping by the next biophysics meeting that comes to town.

  11. GREEN CHEMISTRY

    Shy Chemicals Offer a Solution

    1. David Bradley*
    1. David Bradley is a writer in Cambridge, U.K.

    Sporting fluorine atoms where standard organic compounds have hydrogen, new touch-me-not solvents and catalysts bid to help industrial chemistry clean up its act

    Chlorine-based solvents have a terrible public image as ozone eaters. Under the Montreal Protocol, many chlorine-containing compounds, including several dry-cleaning agents, are banned internationally. Now fluorine, which sits just above chlorine in the Periodic Table, may be set to redeem the family name.

    Researchers are touting the potential for fluorine-based chemicals to speed up important industrial reactions, conserve catalysts and solvents, and use less energy in the process. “The unique properties of fluorine atoms … offer unparalleled opportunities for chemists to speed up and clean up their acts,” says Dennis Curran of the University of Pittsburgh.

    Fluorine's rise as a green chemical began a decade ago in the Corporate Research Laboratories of Exxon Research and Engineering Co. István T. Horváth (now at Eötvös Loránd University in Budapest, Hungary) and colleagues were searching for a better way to oxidize methane to methanol, a major feedstock for the production of fuels and chemicals. They never found it, but in the course of their work they hit on something potentially much more useful. Trying to improve on the widely used industrial process for converting alkenes into useful materials, they began to investigate using fluorine-containing solvents instead of familiar organic solvents such as aliphatic hydrocarbons, ethers, and amines. Their experiments showed that these previously little-investigated “fluorous” solvents simply would not mix with the conventional solvents in their reactions.

    The new fluorous solvents sport C-F bonds where standard solvents have C-H or C-Cl bonds. Like their forebears, they readily dissolve small molecules such as carbon monoxide and hydrogen. But in other ways they are dramatically different. For one thing, fluorine's high electronegativity, or affinity for electrons, makes fluorous solvents reluctant to mix with conventional solvents at room temperature. “They are like liquid Teflon: Organic and inorganic things don't stick,” says Curran, an early fluorous-chemistry enthusiast.

    Escape artist.

    Dennis Curran and other aficionados hope to harness fluorine-rich chemicals' quirky unmixing ability.

    CREDIT: LYNNE CLEMENTE/UNIVERSITY OF PITTSBURGH

    That chemical standoffishness, Horváth realized, might help chemists solve some longstanding problems. At the top of the list is catalysis. Chemists use two main kinds of catalysts: homogeneous ones, which begin their reactions dissolved in the same solution as the starting materials, and heterogeneous ones, which are bound to solid particles that mingle with the dissolved reactants but remain separate from them. Homogeneous catalysts tend to be highly efficient, but reclaiming them from solution once the reaction is complete often requires environmentally noxious solvents that increase the overall costs of the process. Heterogeneous catalysts, by contrast, can be easily separated from the product, cleaned up, and reused. But they are also less efficient, and the high temperatures and pressures often needed to make them practical add costs and environmental burdens of their own.

    Fluorous catalysts offered a third path. These catalysts have sprouted a fluorous appendage, so they readily dissolve in fluorous solvents. Poured into a solution of conventional reactants, such a fluorous phase remains separate, like oil on water, at room temperature. When warmed, though, the two liquids mingle, bringing the dissolved fluorous catalyst into direct contact with the starting materials so that the reaction can proceed efficiently. As the mixture cools, the phases separate again. Nonfluorous reaction products remain with the nonfluorous organic phase, so they can easily be tapped off and the catalyst phase recycled (see figure).

    “During reaction, you have homogeneous catalysis. During separation, it's heterogeneous. It's like having your cake and eating it too,” Curran says. Curran is hoping to cash in on that potential: He has created a spin-off company, Fluorous Technologies Inc., that now supplies a growing market for fluorous reagents.

    Other researchers who have taken the ball and run with it include John Gladysz, one of Horváth's early collaborators. Now at the University of Erlangen-Nürnberg, Germany, Gladysz and colleagues have extended fluorous chemistry to workhorses of the chemical industry such as metal-catalyzed hydrogenations, hydrosilylations, and hydroborations—atom-swapping reactions used to produce everything from agrochemicals to pharmaceuticals.

    Gladysz says his team aims to create a “parallel universe” of fluorous chemicals that would react like traditional reagents and catalysts—phosphines, aliphatic amines, pyridines, arenes, and sulfur compounds—but that would readily dissolve in fluorous solvents. Already the group has developed a fluorous version of aryl iodide, an oxidizing agent used throughout the chemical industry. Normally, chemists discard the byproducts of aryl iodide, but the fluorous versions of the chemicals can be recycled.

    Labs around the world are spinning variations on the fluorine theme. Jean-Pierre Bégué, Danièle Bonnet-Delpon, and colleagues at the French national research agency CNRS in Châtenay-Malabry have developed several new fluorinated compounds in which fluorine replaces only about half of the hydrogen atoms in conventional organic compounds. Bégué says they have found that when they use partially fluorinated alcohols such as hexafluoroisopropanol in synthesizing organic compounds, the oxidation reactions are faster, cleaner, and easier to control. What's more, the only byproduct is water, making the reactions even more environmentally friendly. The team hopes to create fluorous counterparts for processes that now rely heavily on toxic, ozone-eating chlorinated solvents.

    Mix masters.

    Fluorous solvents can make catalysts and reaction products easy to retrieve.

    ILLUSTRATION: A. STONEBRAKER/SCIENCE

    Other groups are ringing changes on the number of phases. In recent experiments, Curran and collaborators in Osaka, Japan, led by Ilhyong Ryu of Osaka Prefecture University in Sakai have sandwiched a fluorous solvent between two otherwise miscible organic solvents—one heavier and one lighter than the fluorous phase. Different starting materials dissolved in each organic phase diffuse across the fluorous phase and react on the other side. Using such three-phase systems, the team has carried out a pair of common processes, the bromination of alkenes and the dealkylation of aromatic ethers by boron tribromide.

    This “liquid membrane” approach, Curran says, provides an easy way to keep heat-generating reactions under control. Instead of using energy-consuming cooling systems or machines that slow the mixing of the reaction ingredients, chemical engineers can control reaction rates simply by changing the thickness of the fluorous barrier. And because the fluorous reaction components simply have to pass through the fluorous solvent, not dissolve in it as they do in a two-phase system, the chemicals can get by with many fewer fluorine atoms, Curran says. That should make them less damaging should any escape into the environment. Curran sees applications for such reactions in many areas, including the preparation of handed forms of drug molecules.
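    That inverse relationship between barrier thickness and reaction rate follows from an idealized diffusion picture. As a minimal sketch (assuming simple steady-state diffusion through the fluorous layer, a simplification not spelled out by the researchers), Fick's first law gives the flux of reactant across a barrier of thickness \(L\) as

    \[ J = D \, \frac{\Delta C}{L}, \]

    where \(D\) is the reactant's diffusion coefficient in the fluorous solvent and \(\Delta C\) is the concentration difference across the layer. All else being equal, doubling the thickness of the fluorous phase halves the rate at which the two sets of starting materials reach each other.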

    Gladysz's team, meanwhile, hopes to eliminate fluorous solvents from fluorous reactions altogether. Building on earlier work by Curran, the researchers have found that many fluorous catalysts work just as well in conventional solvents and can still be recycled as readily. The key to the technique is that many perfluorinated catalysts become markedly more soluble in conventional solvents as the temperature rises. The chemists warm the vessel containing the reagents, feed in the catalyst, and then cool the system after the reaction finishes. As the catalyst becomes less soluble, it drops out of solution to be filtered off. Gladysz and his team are also working to create reactions with fluorous catalysts bound to Teflon shavings, which can simply be filtered off at the end of the process.
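    The sharp temperature response that makes this “dissolve hot, precipitate cold” cycle work has a textbook rationale. As a hedged sketch (assuming ideal van 't Hoff behavior, which the researchers do not themselves invoke here), the solubility \(S\) of the catalyst varies with temperature \(T\) as

    \[ \frac{d \ln S}{dT} = \frac{\Delta H_{\mathrm{sol}}}{R T^{2}}, \]

    so a strongly endothermic enthalpy of dissolution \(\Delta H_{\mathrm{sol}}\) produces a steep rise in solubility on warming, and a correspondingly sharp crash on cooling, that lets nearly all of the catalyst be filtered off.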

    Fluorous chemistry is not a panacea for the chemical industry. Many reaction systems are still off-limits to the approach because no one has made the necessary fluorous reagents or catalysts. Horváth says that problem will solve itself as more researchers start working in the area. Environmental concerns could also cloud the field's future. Although fluorous compounds are safer than their chlorine-based counterparts, they could still cause problems, and many chemists worry that chemically stable, environmentally persistent fluorous solvents could lead to a buildup of troublesome greenhouse gases. But Curran says careful handling and the design of nonvolatile fluorous reagents can minimize those risks. If so, environmentally benign chemical processes could shift from their present patchwork of niches into the industrial mainstream. Fluorine could be green, after all.

  12. CLINICAL RESEARCH

    Climbing a Medical Everest

    1. Alicia Ault*
    1. Alicia Ault is a writer in Kensington, Maryland.

    Inspired by an Oxford epidemiologist, the volunteers of the Cochrane Collaboration are sifting through mountains of data in search of medicine that works

    A decade ago, a small group of epidemiologists and clinicians set out to change medical practice. Their radical notion: Physicians should base treatment decisions on the best available evidence on whether a potential therapy is likely to work. And that evidence, they argued, isn't likely to come from textbooks or a few large, controlled trials. Instead, they reasoned, the best way to see through the mass of data on a specific intervention would be to cull all available studies, give failing marks to any that don't measure up, analyze the rest, and synthesize the results into a single “systematic review.”
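    In statistical terms, that final synthesis is a meta-analysis. As a minimal sketch (assuming the standard fixed-effect inverse-variance method; the collaboration's exact procedures are not spelled out here), each study that survives the culling contributes an effect estimate \(\hat{\theta}_i\) with variance \(v_i\), and the review pools them as

    \[ \hat{\theta}_{\mathrm{pooled}} = \frac{\sum_i \hat{\theta}_i / v_i}{\sum_i 1 / v_i}, \qquad \mathrm{Var}\bigl(\hat{\theta}_{\mathrm{pooled}}\bigr) = \frac{1}{\sum_i 1 / v_i}, \]

    so that the most precise trials carry the most weight and the pooled estimate is more precise than any single trial's.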

    That idea has since blossomed into one of the most ambitious movements in modern medicine. Some 10,000 volunteers around the world are now participating in such reviews, through a loose-knit organization called the Cochrane Collaboration. They have issued 1600 judgments on treatments and procedures, and 1200 more are in the works. They are well on their way to cataloging more than a half-century of clinical trials. The Cochrane is doing yeoman's work, attempting to inject logic into medicine's hodgepodge of wisdom, says Richard Peto, co-director of the Clinical Trial Service Unit and Epidemiological Studies Unit at the University of Oxford, U.K. “The fundamental problem in human knowledge is how on earth to organize it in ways that make it accessible,” says Peto. The Cochrane “really has helped produce order out of chaos, in a world where chaos is looming all the time,” he says.

    But the collaboration has been less successful in achieving its goal of changing clinical practice, at least in the United States. “Clinicians haven't looked closely enough at what Cochrane does,” says epidemiologist Kay Dickersin of Brown University in Providence, Rhode Island, who heads the U.S. branch of the collaboration. And when the collaboration does capture attention, U.S. medicine has turned out to be remarkably resistant. In October 2001, for example, when the Cochrane published a review in The Lancet concluding that mammography screening did not save lives, the U.S. cancer establishment formed a solid front, urging women to ignore the analysis and continue getting mammograms.

    First to enlist.

    Oxford physician Iain Chalmers organized a comprehensive review of pregnancy and childbirth care.

    CREDIT: K. DICKERSIN

    It's a different matter in the United Kingdom, other European countries, and Australia, which are home to most of the army of volunteers who conduct the collaboration's work. Much of the collaboration's funding in the U.K., for example, comes from the National Health Service (NHS), and its conclusions are used to shape national health policy.

    The goal now, say the Cochrane's supporters, is to get the message across to individual physicians. To do that, it is trying to add some management to its freewheeling structure and form a collaboration with a commercial publisher.

    Motley crew

    The Cochrane Collaboration's roots go back to the mid-1970s, when Oxford physician Iain Chalmers assembled a motley crew of volunteers to look at pregnancy and childbirth. They took their name from Archie Cochrane, a British epidemiologist and champion of effectiveness studies who died in 1988. Using a method that became the Cochrane signature, the team scoured 70 medical journals and wrote to some 42,000 obstetricians and gynecologists around the world to gather published and unpublished data on 600 interventions. The group issued a two-volume study, Effective Care in Pregnancy and Childbirth, in 1989.

    The project caught the attention of NHS, which essentially hired the Cochrane to act as an effectiveness research institute, says Chalmers. In 1992, with NHS funding, he and a handful of volunteers opened the very first Cochrane Center, in Oxford. The collaboration went international in October 1993, when it held the first of what have become annual meetings. It is now a loosely knit network of 14 centers—including three in the United States, funded mainly by grants—that support and train the volunteers, who include physicians, statisticians, epidemiologists, nurses, midwives, and even consumers (Science, 5 April 1996, p. 22).

    The Cochrane's army of volunteers has taken on a monumental task. They are hand-searching 2200 medical journals back to 1948 and intend to catalog 1 million published randomized trials. They are one-third of the way toward that goal: The Cochrane Register of Controlled Trials includes 360,000 studies, exceeding the 200,000 in the MedLine database at the U.S. National Library of Medicine, says Dickersin, who joined the effort as a graduate student.

    The Cochrane's core undertaking is systematic reviews of medical interventions, based on detailed rules that basically follow the procedures pioneered in the original obstetrics study. New reviews are published quarterly, and each is supposed to be updated every 2 years.

    Topic selection varies among the 50 review groups, each of which has a particular specialty, such as anesthesia or stroke. Some set priorities, but those priorities vary by country. Developing nations might be most interested in finding the least costly technology, for instance. Abstracts of the reviews are published free of charge at http://www.cochrane.org/. The full contents of the library are published quarterly on the Internet and on CD-ROM. So far, seven countries have paid the Cochrane for national licenses that give their residents free Web access. Authors can submit reviews to medical journals, but those publications put them through their own peer review.

    Reducing harm, maximizing benefit

    There may be no concrete way to measure the Cochrane's impact, but its leaders cite several major projects they think have helped doctors “reduce the harm we do and maximize the benefit,” as Chalmers says. One big success was a 1998 Cochrane review that found that use of human albumin in burn patients not only had no benefit but increased the mortality risk by 6%. After the review's publication, albumin use plummeted by almost half.

    In June 2001, the Royal College of Obstetricians and Gynaecologists issued new guidelines on induction of labor that relied heavily on reviews by the Cochrane's Pregnancy and Childbirth Group. And reviews of smoking cessation and chemotherapy for women with breast cancer also made a big impact in the United Kingdom, says Mike Clarke, director of the U.K. Cochrane Centre.

    The Cochrane also got a lot of attention in the United States for what may seem like a trivial subject: its evaluation of electric toothbrushes, finding that they work better than manual ones. The February 2003 review made print and broadcast headlines around the nation. Another review that month—of 31 psychiatric drug trials covering 4410 subjects—was more striking. A group led by John Geddes of Oxford's Department of Psychiatry found that patients who continued to take antidepressants after an acute episode had a 70% reduction in relapse. “People just weren't expecting such striking results,” says Peto of the review, published in The Lancet.

    But the reviews can also be highly controversial. Take the 2001 mammography study. It came under sharp criticism for excluding some studies that did not meet the collaboration's strict criteria for scientific rigor. The complaints have a long history. The lead authors, Ole Olsen and Peter Gotzsche of the Nordic Cochrane Centre in Copenhagen, had published a mammography review in The Lancet in 2000, concluding that screening did not reduce mortality; it was not an official Cochrane review, but it was still disowned by the Cochrane's Breast Cancer Group for its narrow selection of trial data. The group approved the 2001 Lancet report, but the dispute simmered on. Lancet editor Richard Horton accused the Breast Cancer Group of trying to pressure the Nordic authors into being more supportive of mammography. Cochrane leaders denied it.

    Big picture.

    Massive data analysis can reduce the chaos of medical practice, says Oxford's Richard Peto. Kay Dickersin heads the U.S. Cochrane Center at Brown.

    CREDITS: (TOP TO BOTTOM) OXFORD UNIVERSITY; BROWN UNIVERSITY

    In the end, the mammography review had little official impact in the United States. No organization altered its screening recommendations, even though a National Cancer Institute advisory board suggested that a change might be in order. In Britain, it also prompted a fierce debate, but no policy change.

    Cochrane insiders admit that the process isn't perfect, but they argue that at least it is “transparent.” Authors lay all their cards on the table, describing how they select and weigh trial results. “People can look at how the review was done and decide for themselves” whether the results hold up, says Clarke. If they don't like what they see, their critiques can be added into the database.

    Clarke also argues that the Cochrane reviews present a “truer” picture than typical medical journal reviews, in part because they try to include negative results. Researchers often discuss negative data at meetings but don't publish them. “About half of the studies that reach abstracts never go on to full publication,” says Dickersin. So the Cochrane has begun an ambitious project to accumulate abstracts from conference proceedings that could be incorporated into systematic reviews.

    10,000 reviews to go

    Despite its sometimes idiosyncratic struggles, the Cochrane is pressing ahead. “For a fairly anarchic organization, it's grown very encouragingly over the years,” says a proud Chalmers, who is now retired from the Cochrane. But, he notes, even a mostly volunteer enterprise requires infrastructure, and funding to support it. Just to summarize the existing medical research, Cochrane will have to conduct 10,000 systematic reviews—a job that Clarke estimates will take until 2012 or 2015.

    As it enters its second decade, the Cochrane has begun to step away from its anarchic roots, adding more management and becoming more businesslike in its pursuits. The group now has paid directors like Clarke and has hired a CEO, Nicholas Royle. He will help Cochrane “raise its profile and engage more with other organizations,” says Clarke. And in April, the Cochrane signed up John Wiley & Sons as its publishing partner, partly to broaden its reach. Wiley will deliver the Cochrane Library online.

    The Cochrane is also fine-tuning its selection of review topics. In its current bottom-up approach, almost any volunteer can, in theory, propose a subject and get a protocol approved. In practice, this means that the particular interests of volunteers might take precedence over pressing public health issues. Chalmers defends this “democratization” of research but admits that without priority setting, important clinical questions might go unanswered.

    Cochrane has started to prioritize reviews by consulting with health departments, patient groups, and health charities, but it would never conduct reviews at the behest of outsiders. The organization seeks to influence, but not to advocate. It's a fine line. “Once you start trying to direct things, you become an advocate as opposed to somebody who produces evidence,” says Drummond Rennie, an editor of the Journal of the American Medical Association. “Cochrane is going to live and die by the evidence it produces and its quality,” he says.

    Dickersin says the Cochrane will survive. “The collaboration is filled with a bunch of self-critical idealists, which can make it pretty rough going at times,” she says. “But it also makes it the unique thing that it is, and it is why it works and has the respect it does.”

  13. PUBLIC ENGAGEMENT

    Bringing Science to the Cafés

    1. Daniel Clery

    Forget the digestive biscuits and clotted cream: Britain's tastiest export could soon be a trendy new concept called Café Scientifique

    CAMBRIDGE, U.K.—It's a spring evening in the Borders bookshop here in the center of town, and the coffee bar is packed. A diverse crowd, from students with notebooks on their knees to retired couples out for an evening's entertainment, watches raptly as a slightly rumpled middle-aged academic steps up to the microphone. Tony Minson is not a book author on tour: He's a virologist at the University of Cambridge who's preparing to tell his audience all about “mad cow” disease and the human version, a fatal malady that sends shivers down the spines of most Britons.

    The topic is grisly, but the forum is a lively new way of engaging people in burning issues of science and technology: Café Scientifique, a monthly gathering where members of the public listen to a talk and then participate in open debate. “It's a way of democratizing science,” says Nicole Towler of the British Council's science team. “People value the opportunity to engage with an expert. Anyone's voice can be heard.”

    What began as a hobby for a few enthusiasts 5 years ago has grown into a movement. With the help of a grant from the Wellcome Trust, Britain's mammoth biomedical research charity, more than 20 cafés are either operating or due to start this year. And last month, café organizers from across Europe met in Paris to compare notes and experiences. Indeed, Cafés Scientifiques are blossoming across the globe, from Boston to Copenhagen to Sydney. “We want to internationalize the dialogue about science,” says U.K. coordinator Teresa Anderson.

    In the Cambridge café, the clink of coffee cups and the hiss of espresso machines provide a soothing background as Minson wends his way through the strange history of prion diseases. Britain's cafés follow a tried-and-tested format: The venue is a place where people normally feel relaxed, such as a pub or coffee shop; the speaker should be charismatic; and following a half-hour talk and a break to replenish drinks, the floor is open.

    Café society.

    A diverse crowd gathers in this Cambridge coffee bar once a month to discuss “controversial and sexy” developments in science.

    CREDIT: SHWEN GWEE/NAKEDSCIENTISTS.COM

    Duncan Dallas, an independent TV producer, founded Britain's first Café Scientifique in Leeds in 1998 after reading an obituary of Marc Sautet, who started France's Café Philosophique movement in 1992. Dallas wondered if a variation on scientific issues would fly. To find out, he put a sign in the window of his local wine bar and asked his friends to spread the word. About 40 people turned up, and subsequent meetings have continued to pull in the crowds. The key, says Dallas, is “to pick a subject that is a bit controversial and sexy,” such as the rise of antibiotic-resistant superbugs or whether machines could someday have emotions. “You need to know your audience and build up trust.” Word got around, and cafés soon popped up in Nottingham, Newcastle, and Oxford.

    Back at Cambridge, Minson has wrapped up his talk. While many listeners line up for more coffee, a few avail themselves of the masseuse brought in to knead aching shoulders for a small fee. Snatches of conversation from some students suggest that Minson's talk has struck a chord: They are chatting animatedly about the unpleasant rituals of the South Fore people of New Guinea, who once ate the brains of victims of kuru, a deadly prion disease, and fell victim themselves. Soon the Q&A session begins, and many of the questions cause Minson to pause for thought. One interlocutor wonders whether black-and-white cows, often pictured in news photos as mad cow victims, are more susceptible. Not at all an off-the-wall question, pipes up a veterinary student in the audience, who says that some breeds do appear to be at higher risk of contracting the disease.

    In 2001, Dallas and others won a £170,000 grant from the Wellcome Trust that pays Anderson's salary and some publicity expenses for new cafés, as well as for a Web site of event listings and resources (http://www.cafescientifique.org/). The aim is to create 20 new cafés in 3 years. Anderson targets cities that are large enough to provide a good-sized audience and hunts for people with experience in popularizing science. “I network like hell, and someone eventually pops up,” she says. Anderson helps organizers find a venue, arrange the first few speakers, and publicize the events. After that, the groups should run on their own: The organizers do it for fun, the venues are not paid but benefit from the extra customers, and the speaker's expenses are covered by passing a hat.

    In France, a loose network of similar events, also called Bars des Sciences or Zincs des Sciences, sprang up around the same time as the British cafés. The French network held a meeting in Paris last month that was attended by café organizers from across Europe. “We feel there are things we can learn from each other,” says Anderson. Countries have adopted different styles. In Paris, for example, they often have a panel rather than a single speaker, and in Copenhagen they try to pair a scientist and an artist to talk on the same theme.

    This year Café Scientifique went global. The British Council, a government-funded body that promotes U.K. culture and educational opportunities overseas, has spread the word to its offices in 109 countries worldwide. Several so far have run café-style events with British speakers and more are expected soon. Rather than have speakers travel overseas, the council ran a series of pilot cafés early this year using videoconferencing to link a speaker and a facilitator in the U.K. with audiences in Yugoslavia, India, and Malaysia. More such events are in the works. And in February journalists from the British magazine New Scientist established a beachhead in Boston with North America's first café.

    It's getting late in Cambridge and Chris Smith, a London-based physician who helps organize the café, is packing up. He and his co-organizers have been videotaping the events and plan to put them on their Web site (http://www.thenakedscientists.com/). “If you plug into the public interest, then you'll have a good café,” Smith says. Indeed, science debates could be coming soon to a cozy café near you.

  14. JAPAN

    New Rules Shake Up System For Funding Basic Research

    1. Dennis Normile

    Proposals from industrial scientists, university overhead, and professional staffs are some of the changes on the horizon for Japanese science

    TOKYO—When Koichi Tanaka of Shimadzu Corp. won the 2002 Nobel Prize in chemistry last fall, he became an instant celebrity, elbowing politicians and movie stars off the covers of Japan's weekly magazines. The attention lavished on the 43-year-old engineer, a career “salaryman,” was a welcome tonic for Japan's beaten-down private sector. It also had a direct effect on the country's science policy. By reminding Japanese policy-makers that industrial labs could be wellsprings of new ideas, Tanaka's achievement helped convince the government to give industrial researchers an equal shot at a growing pot of money now reserved for academics.

    The change is part of a package of reforms, the most sweeping in decades, adopted this spring by the Council for Science and Technology Policy (CSTP), the nation's highest science advisory group. Government officials will spend the next year drawing up guidelines to implement the reforms, which promise to revamp how competitive basic research is funded in Japan. The reforms address complaints that academic scientists have long raised about Japan's hidebound research system, but they are likely to increase the competition for basic research funds.

    The reforms cover 26 granting programs across seven government agencies, including everything from basic particle physics and astronomy to applied work on semiconductor materials and solar power. The government has promised to double the overall size of those programs, to $5 billion, over a 5-year period ending in 2006, although the current fiscal crisis has so far limited that growth to $3 billion. The new policies should make the awarding of grants more transparent and the management of the programs and grant monies more professional, says Motoyuki Ono, director-general of the Japan Society for the Promotion of Science, one of the largest funders of competitive grants.

    Many of the suggested changes build upon earlier reforms. In the 1990s, for example, the government began to support postdoctoral researchers for the first time; until then, postdocs hadn't existed in Japan. It also allowed some grant recipients to employ technicians. The council's recommendations would extend that hiring authority to graduate students. They would also give grantees greater fiscal flexibility, including the ability to carry over funds into the next year.

    Tanakamania.

    Winning the 2002 Nobel Prize in chemistry made Koichi Tanaka an overnight media celebrity and helped change Japan's funding rules.

    CREDIT: D. NORMILE

    There are new incentives for institutions as well. At present, universities must dip into their general funds to pay for overhead expenses. The report recommends that institutions instead be allowed to claim up to 30% of each grant to cover everything from utilities to libraries. Ono, who served on the policy council's project team, says the intent is to boost the size of the grants to cover the overhead charge, although it's not clear how quickly that will happen. In return, institutions will be expected to provide more administrative support for grant holders.
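    The arithmetic behind that intent is straightforward. As an illustrative example (the 30% cap is the only figure taken from the report), a project needing direct costs \(D\) under an overhead charge of 30% of the total award \(G\) requires

    \[ G = D + 0.3\,G \quad\Longrightarrow\quad G = \frac{D}{0.7} \approx 1.43\,D, \]

    so grants would have to grow by roughly 43% for researchers to keep the same spending power.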

    Another reform encourages agencies to employ program directors with hands-on expertise in the field they oversee. The council's model is the U.S. National Science Foundation, where many program managers are researchers on loan from a university. Japan has no tradition of university professors going to work directly for funding agencies and instead uses career bureaucrats to run its programs. “We think this is a good idea, but we don't know where we will find the people,” Ono says.

    Tanaka's Nobel rekindled the fire under a long-simmering debate on giving the private sector greater access to public research monies. Providing support for “the next Tanaka” led to the directive to open all competitive grant schemes “to researchers without regard to where they work.” This applies to all funding agencies. But its biggest impact may be felt in the Ministry of Education, Culture, Sports, Science, and Technology's grants-in-aid for scientific research, which provide bread-and-butter support for academic researchers. Corporate researchers already receive government funding, primarily from the Ministry of Economy, Trade, and Industry, for areas such as alternative energy, nanotechnology, and robotics. But they cannot apply for grants-in-aid, which account for about half of the country's total competitive grant funding.

    Japanese companies, whose research budgets have been battered by the stagnant economy, strongly support the change. “Private companies are playing an important role in basic research in Japan,” argues Junichi Sone, general manager of NEC Corp.'s Fundamental Research Laboratories, in Tsukuba. But some academics are uneasy. “The worry is that opening this to corporate researchers will mean less funding for academics,” says Reiko Kuroda, a professor of chemistry at the University of Tokyo and a member of the reform project team. But a CSTP staffer who assisted the reform project team thinks that few private-sector scientists are interested in basic research and even fewer would agree to publish their results quickly, as these grants require.

    Other policy suggestions hinge on broader reforms. The council wants greater independence for younger researchers, for example, as well as increased funding. But at most universities, senior faculty still maintain control over grants won by junior researchers. Ono says he hopes that reforms accompanying the privatization of the national universities, which takes effect next year, will address that issue. The policy council has promised to keep its eye on implementation, which could take several years.
