News this Week

Science  19 Nov 2004:
Vol. 306, Issue 5700, pp. 1270
  1. INFECTIOUS DISEASES

    WHO Gives a Cautious Green Light to Smallpox Experiments

    1. Gretchen Vogel

    A World Health Organization (WHO) advisory committee has given its blessing to limited genetic manipulation of the smallpox virus. If the recommendation is accepted by WHO director-general Jong-wook Lee and the World Health Assembly next year, it would mark the first time since smallpox was eradicated that scientists would be allowed to genetically modify the virus.

    Once smallpox was wiped out in 1979, the known remaining viral stocks were consolidated in two high-security labs: the U.S. Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia, and, later, the VECTOR research center in Koltsovo, Russia. Many involved in the eradication effort pushed for the remaining stocks to be destroyed, but others argued that they should be maintained to allow research on new treatments or safer vaccines in case terrorists or rogue governments have illicit stashes.

    A WHO advisory committee must approve any research done with the remaining virus. In a meeting last week, the members gave their initial permission for three new types of work: insertion of a single marker gene into the virus, transfer of one smallpox gene into a related virus, and distribution of very short fragments of smallpox DNA to labs and companies working on diagnostic tests.

    Scientists at the U.S. Army Medical Research Institute of Infectious Diseases want to insert a gene coding for green fluorescent protein (GFP) into the virus to make it easier to screen for new antiviral drugs. A visual assay would make it possible to automate some of the screening tests, making them both faster and safer, says Riccardo Wittek of the University of Lausanne, Switzerland, who heads the WHO committee.

    Future glow?

    A WHO committee approved plans to insert green fluorescent protein into the smallpox virus, shown here in a false-color image.

    CREDIT: COURTESY OF C. S. GOLDSMITH AND I. DAMON/CDC

    The experiment “has a clear scientific rationale” with little or no chance of accidentally creating a more dangerous virus, agrees molecular biologist Richard Ebright of Rutgers University in Piscataway, New Jersey, who is not a committee member. The review, he says, “is an example of how the process should work.”

    The panel also decided to relax controls over short stretches of smallpox DNA. According to current rules, anyone wanting to obtain any part of the genome has to apply to WHO for permission, but the committee recommended that fragments of up to 500 base pairs should be freely distributed. Such fragments are too short to code for a whole gene but are used as positive controls in diagnostic tests, Wittek explains.

    The committee also decided to permit experiments that would transfer a single smallpox gene—for instance, a gene for DNA polymerase—into a related virus. Scientists have proposed such experiments as a way to test antiviral drugs without using the smallpox virus itself; the work would focus on replication genes rather than virulence genes, Wittek says. Even so, such experiments are potentially more troubling than those with GFP, says Jonathan Tucker of the Monterey Institute of International Studies' Center for Nonproliferation Studies in Washington, D.C., because the committee said such work could be done in enhanced biosafety level 3 (BSL-3) laboratories outside of CDC and VECTOR instead of the more secure BSL-4 labs. “My concern is that as the research proliferates, WHO does not have the resources to exercise proper oversight,” he says.

    But Wittek says that any lab proposing such work would have to go through an extensive review. He says the committee hopes that its approval will speed efforts to find effective treatments for smallpox—one of the goals cited by those who argued for continued research. “It moves you closer to the day when you can destroy the remaining stocks,” he says.

  2. ITALY

    Academics Protest Plan to End Tenure

    1. Alexander Hellemans*
    1. Alexander Hellemans is a writer in Naples, Italy.

    NAPLES—Italian academics last week rallied outside Italy's higher education ministry in Rome to show their disapproval of the government's plans to eliminate tenure and increase teaching loads. The rally was the latest in a series of protests against a reform plan the government says would provide much-needed flexibility but which faculty members fear could drive away the country's best young brains.

    “This has shown that all the people at all the universities are united,” says Piero Tosi, rector of the University of Siena and president of the Conference of Italian University Rectors (CRUI), which views the reforms as an intolerable roadblock for those entering the profession.

    In January, Letizia Moratti, Italy's education and research minister, unveiled a draft law that would apply to the majority of the country's 50,000 researchers and professors at its 70 universities. It would replace the current tenured research track with a series of fixed-term contracts at each step along the academic ladder, regular evaluations, and a national qualifying exam. The reforms address widespread claims that the current system is corrupt, with rigged appointments, nepotism, and mismanagement of public resources. These factors, many believe, have fueled a brain drain of the country's best young academic talent.

    Although many university professors admit that some of these accusations are well founded, they say the proposed reforms would exacerbate the brain drain by creating intolerable roadblocks to entering the profession. The entire process of winning a tenured slot could take as long as 29 years, scoffs Flaminia Saccà of the University of Cassino, who handles research policies for the Democrats of the Left, the main opposition party in Italy. In addition, says Tosi, the reforms do not address the pressing need to improve evaluation of teaching and research efforts on an individual basis. Academics are also upset by the government's push to double their teaching load, now typically two or three courses a year, and by the government's failure to deliver promised funding increases for research.

    The government, which has a solid majority in Parliament, is expected to pass the measures next month, although there could be amendments. Tosi says government officials have agreed to discuss the proposals before the vote, and more protests are planned in order to keep the issue before the public.

  3. FUSION ENERGY

    Euro Meeting Holds Key to ITER Project

    1. Daniel Clery*
    1. With reporting by Dennis Normile in Tokyo.

    CAMBRIDGE, U.K.—The clock is ticking toward midnight for the fragile coalition trying to build the $6 billion ITER fusion reactor.

    This fall research ministers from the European Union (E.U.) set a deadline of 26 November for a decision to begin building the reactor near the French town of Cadarache. But the six partners in ITER are not playing ball: They are currently split down the middle between Cadarache and Japan's proposed site at Rokkasho. Last week, at a meeting in Vienna, Austria, neither the E.U. nor Japan could persuade the other to back down despite both sides claiming to have made major concessions. E.U. officials say they are working to keep the collaboration together, but the ministers' deadline carries with it the implied threat that the E.U. could proceed without the support of all six partners. This week the E.U.'s executive body, the European Commission, met to prepare recommendations for the ministers.

    ITER's goal is to achieve a sustained fusion reaction and generate more power than it consumes. If it works, it promises almost limitless energy, using deuterium extracted from water as fuel and producing little radioactive waste. But first it must be built, at a cost of $13 billion over its expected 30-year life (Science, 13 February, p. 940). Last December the United States and Korea decided to back the Japanese site, whereas Russia and China favored Cadarache (Science, 2 January, p. 22). Since then each site has been vetted further; delegations have crisscrossed the globe, but neither side has blinked. To break the impasse, the partners have studied the possibility of adding other facilities to the ITER project that would accelerate the move toward commercial fusion power.

    In September, frustrated by the impasse, research ministers from the 25 E.U. member states set the 26 November deadline and implied that they would wait no longer on plans to begin construction (Science, 1 October, p. 26). The threat of such a unilateral move infuriated the Japanese, who accused the E.U. negotiators of displaying an arrogance that could undermine not just ITER but other international scientific collaborations as well.

    A dream decision.

    The six ITER partners are looking for a way to anoint both Europe and Japan as winners in the contest to host the reactor.

    CREDIT: TIM SMITH

    In response, Japan quietly began promoting a deal that would minimize the differences between being host and being nonhost. Under the original plan, says Japan's chief negotiator Satoru Ohtake, “being host is like winning the lottery, and being nonhost is like winning nothing.” Japan's goal, he explained, was to reach a point at which choosing a site would be “like tossing a coin.” But it hasn't fared well, he admits: “I don't think the E.U. ever really imagined being nonhost.”

    In fact, some E.U. negotiators misinterpreted Japan's overtures as a sign that it was willing to support Cadarache, a position reported erroneously by Reuters news service the day before the 9 November ITER meeting in Vienna. That inaccurate information got the talks off on the wrong foot, says one E.U. source, who added that the meeting ended on friendlier terms after the E.U. delegation restated its support for a six-partner solution.

    The deal that the E.U. put on the table would have it contribute 58% of ITER's cost, with four other partners giving 10% each and Japan footing 18%—more than the other nonhosts would give. For its extra money, Japan would get “privileged” status in the project, winning more than 18% of the contracts to provide components and more than its fair share of the management structure. The extra money—contributions would add up to 116% of ITER's nominal cost—would go toward the additional facilities, which Japan would take its pick of. Without showing his hand, Ohtake says that the E.U. proposal “is less generous to the nonhost” than what Japan has offered Europe if the reactor went to Rokkasho.
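
    A quick tally of the figures quoted above shows how the proposed contributions exceed the project's nominal price; the arithmetic simply restates the percentages in the preceding paragraph:

    \[ \underbrace{58\%}_{\text{E.U.}} + \underbrace{18\%}_{\text{Japan}} + \underbrace{4 \times 10\%}_{\text{U.S., Russia, China, Korea}} = 116\% \]

    The 16 points above nominal cost are what would pay for the additional facilities from which Japan would take its pick.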

    E.U. officials remain confident that the ITER partners ultimately will embrace the Cadarache site. But continued disagreement remains a possibility. If negotiations break down, says one E.U. official, “ITER must still take place.” But going ahead with fewer than six partners “would be a failure,” too.

  4. PALEONTOLOGY

    Spanish Fossil Sheds New Light on the Oldest Great Apes

    1. Elizabeth Culotta

    Over the past few decades, paleoanthropologists tracing the human lineage back through time have uncovered a series of increasingly apelike ancestors that date to 4 million to 6 million years ago. Even further back, however, the ancestors of humans and our ape cousins remain mysterious, hidden by a patchy fossil record. Now a Spanish team reports on page 1339 that it has found an exceptionally complete 13-million-year-old fossil that it says is closely related to the earliest members of the great ape family—the large-bodied, long-lived, intelligent clan that includes chimpanzees, orangutans, and humans.

    The new find, from Barcelona, is the most ancient ape to show the upright posture, muzzleless face, and other key traits seen in all living great apes, including people, says paleoanthropologist Salvador Moyà-Solà of the Institut de Paleontologia M. Crusafont in Barcelona, the leader of the team that found the skeleton. In his view, the skeleton is part of the group that gave rise to the living great apes. It illuminates the “total morphological pattern of the early great apes,” he says.

    Other researchers are delighted with the discovery; paleoanthropologist Carol V. Ward of the University of Missouri, Columbia, calls it an “amazing fossil.” But not everyone agrees with the team's interpretations. The ape fossil record of this time, the middle Miocene, is so fragmentary that researchers can reach little consensus on the shape of the ape family tree. “It's a marvelous find, a dream come true,” says paleoanthropologist Steven Ward (no relation to Carol) of the Northeastern Ohio Universities College of Medicine in Rootstown. “But the true phylogeny of the great apes is still open to question and will probably not be resolved by this wonderful specimen.”

    Great-great-grandfather ape.

    A new fossil (reconstruction, above, and face, inset) may be closely related to the earliest great apes.

    CREDITS (TOP TO BOTTOM): S. MOYÀ-SOLÀ ET AL./SCIENCE; © AAAS-SCIENCE/ILLUSTRATION BY TODD MARSHALL

    Although dozens of species of Miocene apes have been named, most fossils are fragmentary—a jaw here, an arm bone there. In December 2002, members of Moyà-Solà's team found a canine tooth and an apelike face at a new site near Barcelona, but major excavations had to wait until the summer of 2003. Then the team recovered the ribs, wrist, hands, and vertebrae of a single male individual within an area of about 25 square meters.

    The resulting creature, named Pierolapithecus catalaunicus after the nearby village of Els Hostalets de Pierola in Catalonia, reveals a mix of apelike and monkeylike traits. Compared to earlier Miocene apes, for example, the face has a much-reduced muzzle resembling that of the living great apes. The wide and shallow rib cage and details of the vertebrae show that the roughly 30-kilogram creature stood upright, as great apes do. But not on the ground: Pierolapithecus was a tree dweller eating fruits and vegetation in a tropical forest. Although it has flexible wrists like those of tree-swinging apes and bipedal humans, it retains the relatively small hands and straight fingers of monkeys, implying that, like them, it sometimes walked on all fours on tree limbs. “From these fossils we have, for the first time in the Middle Miocene, the key diagnostic features of the living apes,” together with a large set of primitive monkeylike features, says Moyà-Solà.

    The skeletal bones suggest that the early great apes had a somewhat different lifestyle from that of the living ones, says Moyà-Solà. The apelike wrist coupled with a monkeylike hand, for example, suggests to him that our ape ancestors first climbed vertically through the trees and only later began to develop the extensive adaptations for below-branch swinging behavior seen in all living great apes. (Humans lost these adaptations when they came down from the trees.) If the team is right, chimpanzees and orangutans may have evolved these suspensory adaptations independently, says Moyà-Solà.

    Because there are few fossils to put Pierolapithecus into context, opinions about its place in the ape family tree vary widely. The team puts it at the key branch point between the great apes and the smaller lesser apes, represented today by the gibbons. Paleoanthropologist David Begun of the University of Toronto, Canada, however, cites facial features that he thinks link Pierolapithecus to the African apes, the group that eventually led to chimpanzees and humans, rather than to the Asian orangutans. “I'd put it closer to humans than they would, which makes it even more interesting in some ways,” he says.

    On the other hand, David Pilbeam of Harvard University thinks that the new skeleton could be even more primitive than the authors suggest. He is not convinced that the characters the team cites—wrist, vertebrae, face, and ribs—indicate an evolutionary link to great apes, and he suggests that the similarities may be due to convergent evolution. “I didn't think the face looked particularly like any living ape. I'm agnostic about the idea that it is part of the group that gave rise to extant apes,” he says. “If chimp-orang adaptations are convergent, why believe that Pierolapithecus resemblances are not?”

    Even as they debate the ramifications of the find, researchers are united in their appreciation of a fossil that is sure to advance the field. “We can't say yet what it all means,” says Pilbeam. “But this skeleton is great.”

  5. TOBACCO WARS

    Research on Secondhand Smoke Questioned

    1. Dan Ferber

    The Philip Morris tobacco company quietly conducted extensive animal research in the 1980s that documented the toxicity of secondhand smoke while arguing publicly that it was safe, according to an analysis, published online last week by The Lancet, of thousands of industry and court documents. In a related move, the University of Geneva (UG) has raised doubts about more than 3 decades of tobacco-smoke studies authored by a retired UG environmental-medicine professor who coordinated research for Philip Morris. His failure to disclose that he was a “secret employee of the tobacco industry,” according to a UG faculty commission, tainted his research.

    The allegations in The Lancet drew an immediate response from Philip Morris's parent company, Altria of New York City. It issued a statement saying the charges are “highly distorted and misleading” and that the company has successfully defended against similar allegations in U.S. tobacco litigation. The retired UG professor at the center of the storm, Ragnar Rylander, who is now a professor emeritus at the University of Göteborg, Sweden, maintains that his science was independent and that he is the victim of an antitobacco witch hunt.

    The Lancet study grew out of efforts by two antismoking activists, Pascal Diethelm, president of the Swiss antismoking group OxyRomandie, and Jean-Charles Rielle of the Swiss smoking-prevention group CIPRET-Genève. They mined an online database of millions of documents Philip Morris released as part of a 1998 legal settlement with the state of Minnesota. Diethelm, who worked as an information-technology officer for the World Health Organization's Tobacco Free Initiative, knew that Rylander had authored studies exonerating secondhand smoke. A search in the Philip Morris database flagged 16,000 documents in which Rylander's name appeared, including confidential company memos and scientific reports, financial records, and a company consulting contract with Rylander.

    Smoking gun?

    Critics of Ragnar Rylander cite evidence of his work for a tobacco company.

    In 2001, the two Swiss activists publicly denounced Rylander's work on tobacco smoke as “an unprecedented scientific fraud.” Rylander sued Diethelm and Rielle for libel in a Swiss court and won. Diethelm and Rielle appealed; in December 2003 a Swiss appeals court reversed the lower court's decision on grounds that the charges were true. Meanwhile, UG created a faculty fact-finding commission to investigate the charges on its own.

    The Lancet paper involves many of the players in these battles. Its authors—Diethelm, Rielle, and Martin McKee, a public-health physician at the London School of Hygiene and Tropical Medicine who testified in defense of Diethelm and Rielle at their trial—reported finding more than 800 unpublished studies on secondhand smoke completed between 1981 and 1989 at a Philip Morris facility called the Institut für Industrielle und Biologische Forschung (INBIFO) in Cologne, Germany. In one key 1982 study in rats, INBIFO researchers showed that sidestream smoke, which drifts from lit cigarettes, caused severe damage to the nasal epithelium as well as abnormal cellular alterations, called metaplasia, that are sometimes associated with cancer; sidestream smoke was also up to four times more toxic than the direct smoke sucked from a cigarette. The data were not published.

    The Lancet authors maintain that Philip Morris created INBIFO from the start to learn about the effects of tobacco smoke but concealed the work to reduce liability. For example, the authors say, the company situated the lab in Germany instead of the United States, funded it through a Swiss subsidiary, and told few employees about the tobacco-smoke research. The company contracted with Rylander to serve as an intermediary between INBIFO and Thomas Osdene, a Philip Morris executive responsible for research and development.

    According to the UG report, released in French on 6 September and in English on 29 October, Rylander also organized industry-controlled symposia that excluded researchers who believed secondhand smoke was harmful and failed to identify himself as a tobacco company consultant in letters to the U.S. Environmental Protection Agency downplaying the toxicity of secondhand smoke.

    Last week, UG, acting on advice of the faculty commission, wrote to three journals in which Rylander had published—The European Journal of Public Health, the Archives of Environmental Health, and the International Journal of Epidemiology—warning that “Rylander's work reflects his position as an industry agent rather than a free scientist.” From now on, UG researchers will be barred from accepting tobacco-industry grants or contracts, says André Hurst, the university's president.

    Rylander argues that UG freely accepted Philip Morris's funding for decades and that the authors of the Lancet paper quoted from the tobacco documents out of context to “support their message.” Furthermore, he says that his science was independent, that he was merely INBIFO's “scientific adviser,” and that he had “no insight into their funding.” Quoting the Swedish diplomat Hans Blix, Rylander concludes: “If you believe in witches, and you look hard enough, you'll find them.”

  6. MATERIALS SCIENCE

    Key to Cheaper, Better Nanotubes Comes Out in the Wash

    1. Robert F. Service

    Since their discovery 13 years ago, carbon nanotubes have been nanotechnology's poster child. The tiny straw-shaped molecules are stronger than steel, flexible, and conductive. Researchers have pitched them as the right stuff for everything from chemical sensors and drug-delivery agents to wires for nanoscale computer circuitry and even the building blocks for an elevator extending into space. Their cost, however, is a bit of a problem: At $500 per gram, nanotubes are more than 30 times as expensive as gold. But that price may soon be on its way down.
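
    That price comparison is easy to sanity-check. Assuming a late-2004 gold price of roughly $400 to $440 per troy ounce, about $13 to $14 per gram (our assumption; the article does not quote a gold price), the ratio comes out comfortably above 30:

    \[ \frac{\$500\ \text{per gram}}{\$13\text{--}14\ \text{per gram}} \approx 35\text{--}38 \]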

    On page 1362, Japanese researchers report that by simply adding a little water vapor to a standard nanotube production scheme, they've hit upon a new, highly efficient way to grow nanotubes. If the approach can be scaled up, it could significantly drop the price of nanotubes, opening the door to new commercial applications. The team also reports that the technique makes it straightforward to create macroscale sheets, pillars, and other shapes out of nanotubes, which could become the starting materials for novel types of electronic devices. “The results are quite remarkable and will lead to much follow-up,” says Hongjie Dai, a chemist and nanotube expert at Stanford University.

    In 1991, Japanese physicist Sumio Iijima discovered that nanotubes had grown on the cathode of an arc discharge machine used to make spherical, all-carbon molecules called fullerenes. The machine, which blasts a target of graphitic carbon with a jolt of electricity, turns out a jumble of tubes and soot. Today, most nanotube makers grow their minuscule tubes with the help of nanosized catalyst particles that seed the growth of the tubes inside high-temperature vacuum chambers. The main drawback to this approach is that the resulting tubes wind up contaminated with catalyst particles, which must then be removed through chemical reactions.

    Aquaculture.

    New water-based technique can grow luxuriant columns made of nanotubes.

    CREDIT: K. HATA ET AL./SCIENCE

    In recent years, Iijima, now at the National Institute of Advanced Industrial Science and Technology in Tsukuba, and colleagues have focused on a simple nanotube manufacturing technique called chemical vapor deposition, in which hydrocarbon gases are fed into a superheated chamber containing nanoparticle catalysts. Like other groups, Iijima and his colleagues found that after only about 1 minute of operation, virtually all of the catalysts stopped working. The researchers knew that the high heat broke apart the hydrocarbons, creating a vapor of carbon atoms that link together to form the tubes. The trouble is that each tube must begin growing correctly from its catalyst particle at the outset. Yet in most cases carbon atoms cover the catalyst particles with an amorphous coating that prevents nanotubes from taking shape.

    Other researchers had found that they could remove the amorphous carbon simply by adding pure oxygen. But it works a little too well and quickly oxidizes—or burns—the growing nanotubes. “So we figured we need a weak oxidizer that will not damage the carbon nanotubes,” says Kenji Hata, the physicist who led the current effort. The group decided to look at water, Hata says, because water readily reacts with carbon to create carbon monoxide and molecular hydrogen. Hata found that when he tuned his apparatus to add about 100 parts per million of water to the ethylene feedstock and its carrier gases, the water reacted with the amorphous carbon on the catalyst particles but didn't damage the growing nanotubes. As a result, virtually all of the catalyst particles remained active and quickly produced a forest of nanotubes growing up from a surface. And because of the high efficiency of the growth process, the resulting crop of nanotubes ends up nearly free of catalyst contaminants.
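
    The cleanup step Hata describes is essentially the classic carbon-steam (water-gas) reaction; written as a net equation consistent with his account of the products, it reads:

    \[ \mathrm{C\ (amorphous) + H_2O \longrightarrow CO + H_2} \]

    The selectivity presumably reflects the greater reactivity of the disordered amorphous coating compared with the well-ordered graphitic walls of the tubes; that reading is an inference from Hata's remarks rather than a claim made in the paper.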

    By starting with catalysts patterned in circles and lines, the researchers grew both pillars and sheets of nanotubes. Because nanotubes have such unique optical, electrical, and thermal properties, patterned tubes may enable researchers to make devices such as optical filters and arrays of electron emitters for flat-panel displays, Hata says.

  7. NIH ETHICS

    Staff Scientists Protest Plan to Ban Outside Fees

    1. Jocelyn Kaiser

    In an unusual collective dissent, more than 170 intramural scientists at the National Institutes of Health are protesting a proposed ban that would prevent them from being paid to advise or speak at institutions that receive funds from NIH. In a letter last week to NIH Director Elias Zerhouni, the scientists, many of them lab and section chiefs, told Zerhouni that a ban on so-called honoraria and other payments is an “error” that risks turning NIH scientists into “second-class citizens in the biomedical community.”

    The letter reflects growing frustration with an ongoing crackdown on consulting, which intramural scientists have mostly endured in silence until now. Many feel that the reforms have gone too far: “There's a tremendous amount of unhappiness,” says one lab chief who, like several other signers, declined to be quoted by name.

    The ethics overhaul began after the Los Angeles Times reported large industry payments to several NIH officials, sparking a congressional investigation and a stringent review of all existing outside activities by NIH. A blue-ribbon panel this spring advised Zerhouni that paid industry work should be banned for top officials and those overseeing grants but permitted for intramural scientists, who are not involved in grant decisions. The panel also deemed it “important—even essential” that NIH continue to allow “reasonable” payments for speaking, writing, and teaching. In the past, NIH routinely approved such lectures, as long as the employee discussed work published at least a year before.

    Squeaky clean.

    NIH's Kington aims to avoid the appearance of a conflict.

    CREDIT: NIH

    But after Congress pressed for further reforms, Zerhouni announced in June that all paid consulting by intramural scientists for grantee institutions—including speaking—should be banned. NIH Deputy Director Raynard Kington explains that being paid to speak poses the “appearance” of a conflict because intramural scientists “are privy to information about the scientific direction of the agency” that could potentially give the grantee institution an unfair advantage. After talking to legal experts, NIH is also now concerned that if employees discuss their government work at all, they could be using public office for personal gain, Kington says.

    The 172 staff scientists who endorsed the 8 November letter disagree. Because intramural scientists are not involved in awarding grants, “there can be no conflict of interest,” states the letter, which was initiated by clinical center ethicist Ezekiel Emanuel. Banning activities that are “an essential part of free academic discourse” simply to allay public concerns “seems unjustified,” the writers say. The letter also says NIH staff should be able to “very modestly augment” their “low salaries” with these payments, and that barring them “will further erode” NIH's ability to recruit and retain good scientists.

    The chief concern, some letter signers say privately, is that because rules have become so restrictive, even being reimbursed for travel to a university for official duty could be disallowed. (Kington says this is not the case.) Others are concerned about NIH's plans for a 1-year moratorium on all industry consulting (Science, 1 October, p. 27) and see the honoraria ban as a starting point for a broader discussion. “Many of us feel this is the first chink in the armor to attack,” says virologist Malcolm Martin.

    Some contend that NIH recruitment efforts could be suffering already. Neuroscientist Bruce McEwen of Rockefeller University in New York City says the NIH intramural program is attractive at a time when extramural grants are getting tighter, but the agency's salary cap and limits on outside income “make the situation very difficult” for academic scientists.

    Kington says there's no evidence that scientists are being driven away from NIH but that the 1-year pause in industry consulting will allow NIH to “measure impact.” Meanwhile, Zerhouni and Kington plan to meet with the scientists who signed the letter on 29 November.

  8. AIDS VACCINES

    The First Shot in a Highly Targeted Strategy

    1. Jon Cohen

    For biologists, it's a new way of doing business. Last week, the National Institute of Allergy and Infectious Diseases (NIAID) unveiled a highly targeted strategy to tackle immunological puzzles bedeviling AIDS vaccine research. Rather than rely on individual researchers to submit grant proposals, NIAID is planning to bankroll a collaboration called the Center for HIV/AIDS Vaccine Immunology (CHAVI) to focus on problems already identified as key obstacles to the development of an effective AIDS vaccine. The proposed venture could pump at least $300 million into the field over the next 7 years. And more such initiatives are on the way.

    CHAVI is the first tangible outgrowth of the Global HIV Vaccine Enterprise, an unusual international effort to set priorities for AIDS vaccine research and development and to organize the field to tackle the critical problems. First described in Science last year (27 June 2003, p. 2036) and endorsed by leaders of the wealthy “G8” nations at their summit in June, the fledgling enterprise has held several meetings of leading AIDS researchers to draft a “scientific blueprint” laying out goals and strategies. The idea for CHAVI comes from these discussions, says Peggy Johnston, who heads AIDS vaccine development at NIAID. The institute decided “to get ahead of the game” by taking action even before the blueprint is published, she says. “This is one of [NIAID's] highest priorities.”

    NIAID outlined the plans for CHAVI at a workshop last week. The institute intends to award a 7-year contract to a group of investigators—possibly from several institutions, including ones outside the United States. The collaboration will focus on unraveling the immune responses that have protected monkeys from the AIDS virus in a few well-known experiments that no one has ever satisfactorily explained (Science, 28 June 2002, p. 2325). It will also investigate why some people who are repeatedly exposed to HIV remain uninfected, how the virus changes soon after it establishes an infection, and what the comparative strengths and weaknesses are of various vaccine approaches. CHAVI will need a full-time director, someone willing to give up grants that do not directly relate to the center's research. “It requires a mentality that has not been pervasive in the AIDS community,” says NIAID Director Anthony Fauci.

    Researchers must submit their proposals for CHAVI by 23 February, and NIAID plans to make the award before the 2005 fiscal year ends on 30 September. Fauci does not yet know where the money will come from to fund what could become one of NIAID's most expensive projects but promises that CHAVI will not siphon funds from the institute's existing AIDS vaccine budget (see graph).

    Booster shot.

    NIAID says the new center will not take funds from its existing HIV/AIDS vaccine program, which has been growing steadily.

    CREDIT: DHHS/NIH/NIAID/DAIDS

    The general idea for CHAVI is drawing praise from the AIDS research community. “All of us here love the proposal,” says J. Michael McCune of the Gladstone Institute of Virology and Immunology in San Francisco, California, one of about 50 researchers who attended the meeting (which was broadcast live on the Internet). “It's a very important, great first move,” adds Lawrence Corey of the University of Washington, Seattle.

    Other significant initiatives could come in the next few months from the Global HIV Vaccine Enterprise. Jose Esparza, a senior adviser to the Bill and Melinda Gates Foundation, who serves as the enterprise's interim secretariat, says a final draft of the scientific blueprint is on his desk, and he expects it to be published “very, very soon.” The Gates Foundation also plans in the next few months to announce its own grants that tie into the blueprint. “Our intention is to provide a very significant level of new resources,” says Richard Klausner, head of the foundation's Global Health Program.

  9. MEDICINE

    Estrogen's Ties to COX-2 May Explain Heart Disease Gender Gap

    1. Jennifer Couzin

    Estrogen and the enzyme COX-2 have more in common than their ability to spark enthusiasm tainted by controversy. It turns out that one recruits the other in the blood vessels of mice. This discovery raises the possibility that COX-2 inhibitors—including Vioxx, which was yanked off the market in September because of cardiovascular risks—might be particularly hazardous to females, especially younger ones whose bodies are still churning out estrogen. The work may also help explain how estrogen production in premenopausal females benefits the heart, a hot-button issue ever since hormone replacement therapy in older women was found to boost heart disease.

    “It opens a new option for reevaluating phenomena which we have been wondering about for decades,” says Kay Brune, a pharmacologist at the University of Erlangen in Germany. Although he and others caution about extrapolating the findings to humans, Brune nonetheless calls them “potentially of enormous clinical significance.”

    The study, published online by Science this week (www.sciencemag.org/cgi/content/abstract/1103333), was led by Garret FitzGerald, a pharmacologist and cardiologist at the University of Pennsylvania in Philadelphia—and an outspoken critic of COX-2 inhibitors. He and his colleagues were curious about a fatty acid derivative called prostacyclin, which is produced by COX-2 and whose synthesis is blocked by inhibitors such as Vioxx. Cardiologists consider the loss of prostacyclin a likely culprit behind Vioxx's woes (Science, 15 October, p. 384).

    Unprotected.

    Plaques (red) spread in the arteries of a female mouse lacking a key receptor.

    CREDIT: K. M. EGAN ET AL./SCIENCE

    FitzGerald's group created mice genetically susceptible to atherosclerosis and lacking the prostacyclin receptor. The scientists were intrigued to see that in these animals there was no gender gap in heart disease—a divergence long observed in both people and mice in which younger males are at higher risk than younger females. “Females caught up to males,” developing atherosclerosis just as quickly, says FitzGerald. A closer look revealed that without the prostacyclin receptor, female mice were highly susceptible to oxidative damage from free radicals, which boost plaque formation in arteries.

    Then the team sought to bring estrogen into the picture, turning to mice whose ovaries had been removed and who were given supplemental estrogen. The hormone supplements, FitzGerald and his colleagues found, increased prostacyclin biosynthesis and depressed oxidative stress; both effects were tracked in urine samples.

    This suggests that in premenopausal females, estrogen, acting through one of its receptors, stimulates COX-2 production. That boosts prostacyclin, which in turn protects the heart from atherosclerosis, says FitzGerald. This may explain much of the gender gap in heart disease, because that estrogen-driven pathway would be much weaker in males.

    But as wary researchers know, mice aren't people. For decades, based on studies in mice and other animals, millions of menopausal women took hormone supplements believing the drugs would stave off heart disease. Then, in 2002, the Women's Health Initiative (WHI) reported that women on hormones were more likely to suffer heart problems than those on placebos.

    One question raised by the new study is whether the use of COX-2 inhibitors or other nonsteroidal anti-inflammatory drugs by women in WHI influenced the study's outcome. Women in the study on aspirin didn't appear to be at higher risk, says Richard Karas of the molecular cardiology research institute at Tufts University in Boston. But FitzGerald is beginning to comb through the hormone data from WHI and other studies to take a second look. His team's findings, he notes, could be much more applicable to younger women; although the published COX-2 inhibitor studies haven't noted gender gaps in heart risks, most participants were older.

    Two other COX-2 inhibitors, Celebrex and Bextra, remain on the market, and scientists are trying to determine whether they share Vioxx's risks. FitzGerald's work “raises the question of whether there are gender-specific risks related to the COX-2 inhibitors,” says Karas. If it holds up, he adds, it also suggests that the cardiac hazards “are not some weirdo side effect of Vioxx that's not true in the other” drugs.

  10. VOLCANOLOGY

    Iceland's Doomsday Scenario?

    1. Richard Stone

    The more researchers learn about the unheralded Laki eruption of 1783, the more they see a need to prepare for a reprise that could include fluoride poisoning and widespread air pollution

    SKAFTÁRTUNGA, ICELAND—Hildur Gestsdóttir shovels a heap of fine black soil onto a growing mound beside the unmarked grave, grateful for a breeze from a nearby glacier that's taking the edge off the strong summer sun. “It's a lovely day for gravedigging,” a member of her team remarks. Hildur agrees: “Conditions are perfect.”

    Hildur ought to know, having exhumed about 50 skeletons to date with the Institute of Archaeology in Reykjavik. Usually she's after the remains of Vikings, who settled the island 1000 years ago, or later medieval inhabitants. This grave is much more recent, dating from the late 18th century. Although the period is not her forte, the skeleton beneath Hildur's feet on Búland farm could well be a researcher's treasure, offering clues to why the eruption of the nearby Laki fissure in 1783 was so deadly. One of the largest and least appreciated eruptions in recorded history, Laki killed 10,000 Icelanders—roughly one in five—and recent studies suggest that its billowing plumes led to extreme weather and extensive illness that may have claimed thousands more lives in Britain and on the European continent.

    “It's hard to fathom the impact of Laki,” says volcanologist Thorvaldur Thordarson, a leading expert on the eruption. A similar blast in modern times would pump so much ash and fumes into the upper atmosphere that the ensuing sulfuric haze could shut down aviation in much of the Northern Hemisphere for months, Thordarson and Stephen Self of Open University in Milton Keynes, U.K., argued last year in the Journal of Geophysical Research.

    “It's not a matter of if but when the next Laki-like eruption will happen” in Iceland, says Thordarson, who splits his time between the University of Iceland and the University of Hawaii, Manoa. “We certainly don't want to be here when another Laki-type event hits,” adds Self. Offering a tame glimpse of what the future may hold, the brief eruption of Iceland's Grímsvötn volcano earlier this month led to the cancellation or rerouting of transatlantic flights. Still, volcanologists say, the odds of a full-blown fissure eruption in this century are low.

    By examining presumed victims of Laki, Hildur and her colleagues, including project leader Peter Baxter, a medical researcher at the University of Cambridge, U.K., are testing a thesis that fluoride in Laki's emissions poisoned people directly and may account in part for the high death toll. “It was the greatest calamity to affect Iceland since human occupation began there,” says Baxter.

    During the eruption, an estimated 1 million tons of hydrofluoric acid were deposited over Iceland, contaminating the country's food and drinking water supplies. Icelanders who lived through the eruption noted that sheep and other livestock developed knobbly protrusions from their bones that were clearly visible under the skin—a telltale sign of fluorosis. Baxter's team is the first to exhume presumed victims of Laki to look for abnormal bone growth and high levels of fluoride that could well have led to fatal poisoning in people during the later months of the eruption.

    If they are right, Iceland's fissure eruptions may be much more dangerous than scientists had supposed. And this realization implies that civil-defense planners need strategies for the next Laki-like event. “It's important to consider what the next one is going to do, and how we can prepare for it,” says Clive Oppenheimer, a volcanologist at the University of Cambridge.

    Putting her back into it.

    Archaeologist Hildur Gestsdóttir digs up graves on Búland farm in search of Laki's victims.

    CREDIT: R. STONE/SCIENCE

    Fire and brimstone

    Laki, a fissure in the basalt lava fields of Iceland's southeastern fringe about 50 kilometers north of Búland, embraces 140 volcanic vents that seem to march in a neat row, 27 kilometers long, toward massive Grímsvötn in the northeast. Hunkered beneath a glacier, Grímsvötn is a restless giant, awakening every 10 to 15 years on average. Its eruptions, including one that began on 1 November and lasted 5 days, unleash torrents of glacial meltwater—awe-inspiring floods called jökulhlaup—onto the coastal plains.

    Iceland is the only spot on Earth above sea level where fissures, formed by spreading at midocean ridges, are likely to erupt on a titanic scale. Laki-like events happen every 500 to 1000 years or so, although “you can have a group of eruptions in a very short period,” says Thordarson. And if fluorine were gold, Iceland would be a fabulously wealthy nation. “Not all magmas are fluorine-rich. Persistent offenders seem to include volcanoes in Iceland and Melanesia,” says Oppenheimer.

    The last time Laki roared to life, all hell broke loose. Reverend Jón Steingrímsson was an eyewitness who recorded his observations of the 1783 eruption in his Eldrit, recently translated into English as Fires of the Earth:

    Around midmorn on Whitsun, June 8th of 1783, in clear and calm weather, a black haze of sand appeared to the north of the mountains nearest the farms of the Sída area. … That night strong earthquakes and tremors occurred.

    Steingrímsson's chronicles of the months-long spectacle are “phenomenal,” says Thordarson. He was the first to describe “Pele's hair”—ash “shaped like threads,” Steingrímsson wrote, “blue-black and shiny, as long and thick around as a seal's hair.” He also first recorded spatter bombs: blobs of lava hurled into the air that splat “like cow dung” after landing, says Thordarson. Whereas volcanoes like Mount St. Helens and Pinatubo erupt explosively, Laki and its brethren erupt effusively, similarly to the relatively tame lava fountains spilling out of Hawaii's Kilauea. As one of the largest documented effusive eruptions, Laki lasted 8 months, disgorging an estimated 14.7 cubic kilometers of lava, approximately 150 times the average amount for a basalt eruption and enough to cover 580 square kilometers of the island.
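
    Dividing those last two figures gives a feel for the flow's scale; the implied average lava thickness, a back-of-the-envelope estimate rather than a number reported by the researchers, is on the order of 25 meters:

    \[ \frac{14.7\ \mathrm{km^3}}{580\ \mathrm{km^2}} \approx 0.025\ \mathrm{km} \approx 25\ \mathrm{m} \]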

    Laki cast a deathly pall over Iceland. “The entire country was basically engulfed in volcanic fumes,” says Thordarson. “They couldn't even go out fishing; they would get lost in the haze.” The prodigious emissions, he says, included an estimated 122 million tons of sulfur dioxide, 7 million tons of hydrochloric acid, and 15 million tons of deadly hydrofluoric acid.

    The foul smell of the air, bitter as seaweed and reeking of rot for days on end, was such that many people, especially those with chest ailments, could no more than half-fill their lungs of this air, particularly if the sun was no longer in the sky; indeed, it was most astonishing that anyone should live another week.

    Vapors of death

    Laki's acrid tendrils reached far beyond Iceland's shores. Scores of published accounts of weather conditions on continental Europe during the summer of 1783 refer to a persistent haze, or “dry fog,” that lasted months. Ever ahead of his time, Benjamin Franklin reasoned that the dull sun and blood-red sunrises and sunsets were the result of a volcanic eruption in Iceland.

    Piecing together a more detailed picture of Laki from historical documents and from their own computer modeling, Thordarson and Self concluded in the Journal of Geophysical Research last year that Laki's eruption columns rose as high as 13 kilometers into the air. “No one had envisioned such explosive powers in a fissure eruption,” says Thordarson. The resulting aerosol veil hung over the Northern Hemisphere for more than 5 months.

    Laki writ small.

    This month's eruption of Grímsvötn led to flight cancellations and sent farmers in Iceland scrambling to shelter livestock from fluorine-rich ash.

    CREDIT: R. STONE/SCIENCE

    In a cruel twist, the volcanic haze rolled in just as Europe was wilting in an unusually hot summer. The fumes, some researchers argue, sent thousands of people to an early grave. In a paper in press at the journal Comptes Rendus, a team led by John Grattan of the University of Wales in Aberystwyth reports that, according to burial records, there were 25% more deaths than usual between August 1783 and May 1784 in 53 parishes across France. Extrapolating these numbers countrywide, they write, Laki's death toll in France “may be far in excess” of the 16,000 people whose deaths have been linked to air pollution and oppressive heat during the summer of 2003.

    Extending Grattan's work in England, Oppenheimer and Claire Witham of the University of Cambridge reported last May in the Bulletin of Volcanology that about 20,000 people in England alone succumbed to climate anomalies in the summer of 1783 and the following winter. Scouring burial records of 404 English parishes over a 50-year period, 1759 to 1808, they found that August-September 1783 and January-February 1784 were especially lethal months. Weather records confirm that the summer of 1783 was notably hot in England.

    The following winter was one of the most severe ever recorded in European annals. Anecdotal reports point to a shortage of firewood throughout Europe, and Europeans were dying in droves during that winter, according to findings published by Grattan and colleagues in the late 1990s. The mean surface cooling in Europe during the 2 years following the eruption was about 1.3°C, according to Thordarson and Self. They blame Laki's aerosols for having disrupted the Arctic “thermal balance.”

    Oppenheimer acknowledges that it's “a challenge” to make a direct link between Laki and the sharp spike in mortality in 1783–84. “People may have been hit by a cocktail of things,” he says. But if Laki were the primary cause, it would be the third most deadly eruption in history, after Tambora in 1815 and Krakatau in 1883.

    Volcano victim?

    Nodules at the top of this pelvis, which once belonged to a blonde-haired woman in her 30s, suggest fluorine poisoning.

    CREDIT: ALEXANDER H. JAROSCH/INSTITUTE OF EARTH SCIENCES, UNIVERSITY OF ICELAND

    Divining the bones

    A tractor, engine growling, zigzags along a slope above Búland farm, tilting precariously as it turns hay. From the top of the hill you can see for miles: waterfalls fed by glacial runoff plunge through crags in verdant hills, sheep graze near a braided river, and a vast plain of moss-encrusted Laki lava stretches to the horizon.

    Near the farmhouse, a cluster of dwarf birch trees marks the boundary of what once was a cemetery. In the 18th century, the graves abutted a church that has long since vanished.

    Baxter kneels next to the mound of soil dug from the grave. Death is his forte. Nicknamed by colleagues “Dr. Doom,” Baxter has pioneered the study of how volcanoes kill (Science, 28 March 2003, p. 2022). “This is remarkably fine ash,” he says, sifting it through his fingers. Long ago, fluoride compounds were identified as the culprit responsible for much of the loss of livestock during and after the eruption. Baxter notes that fluoride salts adhere to ash particles, which in turn would have clung to the vegetation and would have been consumed in prodigious amounts by grazing animals. Even during this month's reawakening of Grímsvötn—part of the same volcanic system as Laki—farmers in areas where the ash fell brought livestock indoors to prevent the animals from ingesting fluoride-laced ash.

    Serene desolation.

    The abandoned graveyard of Ásar church overlooks a vast plain of weathered Laki lava; a skull emerges from excavations at Búland farm.

    CREDITS: MUTSUMI STONE/SCIENCE

    During the Laki eruption, “fluorine poisoning was observed all over Iceland” in the form of bone malformations, says Thordarson. “We know the livestock were being poisoned and that within months people started dying,” says Hildur. “But no one wondered whether people were also dying from direct poisoning” from contaminated food or water. Indeed, says Baxter, Steingrímsson's accounts of abnormal bone growths in people were long overlooked. The reverend wrote:

    Those people who did not have enough older and undiseased supplies of food to last them through these times of pestilence also suffered great pain. Ridges, growths and bristle appeared on their rib joints, ribs, the backs of their hands, their feet, legs and joints. Their bodies became bloated, the insides of their mouths and their gums swelled and cracked, causing excruciating pain and toothaches.

    In their pilot study, Baxter and his colleagues have looked for graves in cemeteries at Búland and the nearby Ásar church that were abandoned at the end of the 19th century. Hildur and her colleagues dated the graves according to layers of volcanic ash in the soil. The grave at Búland, for example, was dug shortly after the ash fell from Laki but well before the ash from an 1845 eruption of the Hekla volcano.

    Hildur, in a hole that's now more than a meter deep, uses a spade to clear dirt from the coffin lid. She dons surgical gloves and begins removing pieces of the decaying wooden lid. After a half-hour of painstaking work, she exposes the skull to the light of day. Its matted blonde locks are stunningly preserved.

    Hildur passes the fragile remains to her officemate, Gudrún Alda Gísladóttir, who stows them in Ziploc bags. The pelvis, from a woman apparently in her 30s, has nodules protruding near the top edge. “This is very unusual,” says Baxter. “It may well be the result of fluorine poisoning.” The pelvis of another presumed Laki victim exhumed in the spring was similarly misshapen. “Two graves, randomly chosen, showing the same changes,” he remarks. To trigger such bone growth, “you would have to have really slugged them with fluoride.”

    The heftiest doses would have come through drinking water, possibly up to 30 or 40 parts per million—as much as 30 times the permissible level today, says Baxter. “It was high enough that you would have felt sick if you drank the water,” he says. “But they were in such a terrible state, they had no choice.” The Icelanders were already suffering from deficiencies in vitamins C and D. “Then add fluorine,” he says. “Nutrient deficiency could have made the population much more susceptible to fluorine poisoning.”
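
    Baxter's multiplier can be reconstructed if one assumes the benchmark he has in mind is the modern WHO drinking-water guideline for fluoride of 1.5 parts per million (our assumption; the article does not name the standard):

    \[ \frac{40\ \mathrm{ppm}}{1.5\ \mathrm{ppm}} \approx 27 \]

    which is roughly in line with his figure of “as much as 30 times” today's permissible level.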

    In September, bone samples were shipped to the University of Cambridge for testing. There, a team led by Baxter and Juliet Compston is measuring the levels of fluoride and other trace elements, such as arsenic, in ashed bone samples. “It could be a soup of chemicals from the volcano,” says Baxter. Georges Boivin of the University of Lyon, France, is now using x-ray diffraction to determine precisely how fluoride ions were substituted for other ions in the bone's apatite crystal matrix. Results are due by the end of the year. Baxter hopes that the preliminary findings will lead to a “robust” study involving many exhumations, which could nail the fluoride link.

    Superhot spot.

    Laki's 140 vents run diagonally from Katla to Grímsvötn at the center of Iceland's volcanic zones.

    CREDIT: T. THORDARSON/UNIVERSITY OF ICELAND

    The next apocalypse

    The Laki eruption has been a tragedy lost in time. “People ignored it for so long,” says Thordarson.

    That's changing. Volcanologists now view Laki as a potent warning, and some are considering what could be done to prepare for a reprise, beyond protecting food supplies and handing out respiratory masks.

    Some potential consequences could not have been dreamed of the last time Laki erupted. The atmosphere, laden with charged particles, would bristle with electricity, possibly interfering with satellite communication. And the plume could well wreak havoc on civil aviation. “How would British Airways deal with its jets being grounded for 5 months?” asks Thordarson. “Planners would be smart to think ahead about how they might deal with such a contingency,” says Christopher Newhall, a volcanologist with the U.S. Geological Survey in Seattle, Washington. However, he notes, “the chances of the next one happening in our lifetimes is relatively low.”

    At the moment, Iceland's fissures do not seem to be brewing trouble. “We have not seen potential precursors for an eruption,” says Freysteinn Sigmundsson of the Nordic Volcanological Centre in Reykjavik, who serves on the science committee of Iceland's civil defense department. Precursors could include earthquakes, deformation of the earth's crust, or an uptick in geothermal heat.

    But volcanic fissures are hard beasts to track. A full-blown fissure eruption would follow an upsurge in magma from reservoirs near the crust-mantle boundary about 10 to 20 kilometers below the surface. Before the next Laki-type eruption, huge volumes of magma need to accumulate—as much as 15 cubic kilometers, roughly the amount generated under all of Iceland over a span of 100 years, Sigmundsson says. Although a strategy for monitoring precursors of such events remains elusive, he says, satellite radar imagery can detect crustal deformation—and thus magma accumulation—as deep as the crust-mantle boundary. “Judging from Laki, we would have 3 to 4 weeks of precursor activity,” mainly in the form of earthquakes, Thordarson says.
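
    Sigmundsson's comparison implies an island-wide magma generation rate, a figure derived here rather than stated in the article:

    \[ \frac{15\ \mathrm{km^3}}{100\ \mathrm{yr}} = 0.15\ \mathrm{km^3/yr} \]

    Because that supply is shared among many volcanic systems, any single system would plausibly need several centuries to amass a Laki-scale reservoir, which squares loosely with the 500-to-1000-year recurrence cited earlier.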

    Yet there are uncertainties galore. High magma pressure at Grímsvötn and Katla—a volcano just to the southwest of Laki—could trigger a failure of the plate boundary between the volcanoes, which in turn could spark a fissure eruption, Sigmundsson says. Civil defense officials will remain vigilant for signs of such an event, he says: “We're following the situation closely.”

    A Laki-esque eruption could also occur in other volcanic systems in Iceland. Katla's current bout of insomnia is particularly disconcerting. The biggest fissure eruption in recorded history was that of the Eldgjá fissure, just east of Búland and connected via its plumbing to Katla. Over 6 years beginning in 934 C.E., Eldgjá spewed about twice the amount of sulfurous materials into the air as Laki later produced. “Eldgjá had a huge environmental impact and probably stopped settlement of Iceland for some years,” says Thordarson. “In that eruption the fissure and Katla volcano erupted simultaneously.”

    Current scientific interest in Laki and its ilk stems in some measure from a new appreciation for the observations of Steingrímsson, who saw the eruption as a religious apologue that would die with him unless he committed it to paper. As he wrote in his foreword to Eldrit, “I thought it would be unfortunate if these memories should be lost and forgotten upon my departure.” The deformed, fluoride-laden bones that Hildur and Baxter have unearthed may provide another powerful testament to the peril of taking Iceland's fissures lightly.

    Thordarson, for one, is intent on persuading colleagues and the general public that Laki is a sleeping giant that cannot be ignored. “We're much better off if we prepare ourselves for the worst-case scenario,” he says. “I'm not trying to be a doomsayer. But it could happen tomorrow.”

  11. HIGH-ENERGY PHYSICS

    Rara Avis or Statistical Mirage? Pentaquark Remains at Large

    1. Charles Seife

    Two years after its surprise appearance in debris from a nuclear collision, some researchers suspect an exotic particle may be a will-o'-the-wisp

    Two years ago, scientists in Japan and then the United States made headlines around the world by announcing that they had found an unusual particle. This creature, dubbed the Θ+ (theta-plus), was apparently made of five quarks rather than the two or three quarks that make up all other known quark-based matter in the universe. That unique property would make the so-called pentaquark a totally new way to probe the forces that hold atoms together (Science, 11 July 2003, p. 153). “It's a fantastic beast—if it exists,” says Ted Barnes, a physicist at Oak Ridge National Laboratory in Tennessee.

    But the beast might be mythical. Even though a dozen experiments have independently claimed to detect the Θ+ particle in their data sets, and physicists at the Thomas Jefferson National Accelerator Facility (JLab) in Newport News, Virginia, are trying to corner the particle, some particle physicists are murmuring their disbelief. As negative results and inconsistencies pile up, many scientists suspect that pentaquark aficionados are chasing a phantom.

    Edward Hartouni, a physicist at Lawrence Livermore National Laboratory in California, is part of a team that pored through the debris of a billion energetic particle collisions, searching for evidence of pentaquarks. If the particles exist, they should show up on data plots as a huge spike. They are missing. “There is no large peak here,” Hartouni says.

    Shaky pillar?

    Data spikes hinting at new type of matter have drawn fire from skeptics.

    SOURCE: K. HICKS/OHIO UNIVERSITY

    The case in favor

    It's hard to believe that something that has been spotted at so many laboratories might be an artifact. Indeed, these laboratories seem to have spotted the Θ+ in different ways. Some, including the SPring-8 experiment in Japan and those at JLab, zap nuclei with light. Others, including experiments in Russia and Germany, smash mesons or protons or electrons into nuclei. The result in each case seems to be the same: a “peak” in the data that signals the brief life of a five-quark particle—whose mass is a bit more than one-and-a-half times that of the proton—that quickly decays into a handful of smaller particles.

    Particle physicists have been finding such peaks for decades. In the spray of debris after a collision, there's a wealth of information as to what happened. Scientists with sufficiently good detectors can look at the tracks of the debris and identify what those particles were. By tracing the tracks backward and seeing how they combine or split apart or kink or curve, physicists can infer what sorts of particles were created in the smashup, how heavy they are, how much charge they carry, and what they decay into. Often, scientists will graph data in a way that shows the number of times that a certain collision yields an event with a given energy. A lump in that data will often indicate the repeated creation of a particle with a given mass-energy; for example, scientists running the right type of experiment will see a clear peak at 1520 MeV that indicates the creation of a three-quark particle known as the Λ(1520).

    The better the detectors are and the more events are analyzed, the more starkly a particle's lump will stand out amid the bumps and wiggles of background noise and statistical fluctuations. According to Kenneth Hicks, a physicist at Ohio University, Athens, and member of the JLab team, lots of experiments have detected a peak that would signal the creation of the Θ+ particle—a very narrow peak at about 1540 MeV—with good statistical significance. “Some appear to be very significant,” he says. “About one dozen experiments have seen it with statistics better than 1 in 1000.” Or, more precisely, there have been quite a number of “three-sigma” detections, which are the informal gold standard of statistical significance under many circumstances in particle physics. “Many [detections] are higher: four, five, six sigma,” adds Hicks. Five- and six-sigma results are usually considered quite high quality, and one that turns out to be false can wind up being a high-profile embarrassment (Science, 29 September 2000, p. 2260).
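
    For readers keeping score at home, those sigma levels map onto probabilities through the Gaussian tail function. This minimal sketch (plain Python, standard library only, not drawn from any of the experiments' own analyses) shows why a bump a bit above three sigma corresponds to the 1-in-1000 odds Hicks cites:

    ```python
    from math import erfc, sqrt

    def tail_probability(n_sigma: float) -> float:
        """One-sided Gaussian tail probability: the chance that background
        noise alone fluctuates upward by at least n_sigma standard deviations."""
        return 0.5 * erfc(n_sigma / sqrt(2))

    for n in (3, 4, 5, 6):
        print(f"{n} sigma: about 1 in {1 / tail_probability(n):,.0f}")
    # 3 sigma: about 1 in 741 -- roughly the '1 in 1000' gold standard;
    # 5 sigma: about 1 in 3,500,000.
    ```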

    The case against

    Even so, some physicists remain unconvinced. For a start, the apparent sightings pose theoretical problems. The narrower the peak, in general, the more stable the corresponding particle, and pentaquarks shouldn't be all that stable. According to theories about the forces that bind nuclei into stable packages, it's difficult to see why a five-quark ensemble, for example, wouldn't rapidly decay into a baryon (with three quarks) and a meson (with two). “It should spontaneously fall apart,” says Barnes. “There's no reason for it to stay together.” If some unknown mechanism keeps it from splitting into fragments, it must bind the five quarks quite tightly to create the narrow peak.

    Although this is difficult to understand, it might be the sign of exotic new physics for theorists to figure out. However, although the peak is narrow—25 MeV wide or smaller, with its apex pinned within 5 or 10 MeV—its center seems to move around. The experiments that claim to have seen the Θ+ peg its mass-energy at anywhere between about 1525 MeV and 1555 MeV. “That's an extraordinarily large range,” says Michael Longo, a physicist at the University of Michigan, Ann Arbor.

    Oddball.

    All other known quark-based particles contain either two or three quarks; controversial pentaquark would boast five.

    CREDIT: K. BUCKHEIT/SCIENCE

    “It worries some people, but it doesn't worry me,” counters Hicks, who argues that the inherent errors in pinning down masses can account for the differences between the experiments. “It's consistent within the error bars.”

    More disturbing to skeptics, though, is that a number of efforts to mine old data for signs of the Θ+ have come up empty. For example, Longo's team reanalyzed the debris of proton-proton collisions at Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, and failed to find any five-quark particles. “One of the things we thought we would be able to see is the pentaquark state Θ+,” he says. “We thought we'd just verify the existence of the state. It just wasn't there.” Hartouni's group also looked at collisions at Fermilab, and although they found a nice sharp peak that signaled the existence of the nearby Λ(1520) particle, there was no hint of a spike for the Θ+ at 1540 MeV. “At 1540, we see essentially no [Θ+] production,” he says.

    Hicks calls the Hartouni result “one of the most significant results of a nonobservation.” But with so little known about how pentaquarks are produced, he says, it's possible that high-energy collisions like Fermilab's might not create pentaquarks in the same manner as the lower-energy collisions at JLab. Others are less sure. “If, in fact, the JLab results are confirmed, we have a true puzzle,” says Hartouni. “If the production mechanisms are different, we have to think very hard as a field.” Some, such as physicist Alex Dzierba of Indiana University, Bloomington, go even further and call the Fermilab results and an increasing number of other nonsightings from laboratories in the United States and in Europe “an overwhelming body of negative evidence.”

    How, then, to account for the dozen sightings at different labs? There are a number of mechanisms that could lead to a hump in peakless data. Robert Chrien, a physicist at Brookhaven National Laboratory in Upton, New York, says the peaks could be artifacts caused when physicists weed out their data sets to reduce statistical noise. “As soon as you make cuts, you can introduce bias,” says Chrien. “We once thought we saw a peak in a gamma ray spectrum exactly at an energy predicted by one of the theories,” he says of an unrelated experiment—a peak that disappeared as soon as the biasing cuts were wiped out.

    There are other possibilities, too. Dzierba suspects that some of the Θ+ sightings are due to “ghost tracks”: incorrectly reconstructed particle trails that plague high-energy physics experiments. Sometimes even the most sensitive equipment will see two particles when only one exists. Dzierba says an incorrect tally of certain reactions involving the Λ(1520), pions, and protons could make a peak at “precisely the mass of the Θ+.” Other exotic effects, such as a “reflection” of another peak, might cause a miragelike hump in the data. “There's lots of possibilities,” says Barnes. However, he adds, coming up with alternative explanations for the pentaquark peak “doesn't really mean anything. You can't say it doesn't exist. The issue is going to get settled with a really good experiment.”

    That's what the folks at JLab hope, too. “As of now, there's no clear experimental evidence for either the existence or nonexistence of the Θ+,” says Stepan Stepanyan, a JLab physicist who has been performing a high-statistics search for the pentaquark. Although the team has gathered roughly five times as much data as the first JLab sighting, it is not yet ready to release its analysis. If the peak disappears or stays small, then the pentaquark will almost certainly be an artifact of the analysis. If the peak gets starker—and if the team is careful about their cuts and weeds out the ghost tracks and phantom reflections—then it will likely mean that the Θ+ is real, and that a seemingly unstable beast gets stability from an unknown mechanism.

    “The stakes are high,” says Stepanyan. “If the Θ+ exists, then our naïve picture of [nuclear] structure will change.” If it doesn't exist, though, then the pentaquark hunters will add their names to the rolls of those who went hunting for big game and wound up on a wild-goose chase.

  12. HUMAN EVOLUTION

    Faster Than a Hyena? Running May Make Humans Special

    1. Carl Zimmer*
    1. Carl Zimmer is the author of Soul Made Flesh: The Discovery of the Brain—and How it Changed the World.

    Scientists propose that hominids evolved into long-distance runners 2 million years ago to become better scavengers on the African savanna

    Depending on your point of view, last week's New York City marathon was a demonstration of athletic excellence or of unparalleled masochism. But according to a report in this week's issue of Nature, it was also a display of a key innovation in human evolution. University of Utah biomechanics expert Dennis Bramble and Harvard physical anthropologist Daniel Lieberman argue that the human body is exquisitely adapted for endurance running. They marshal evidence that the ability to run long distances emerged 2 million years ago, possibly enabling our ancestors to become better scavengers. If the researchers are right, running goes a long way toward explaining why our bodies are so different from those of other apes.

    It may come as a surprise to hear that humans excel in running. Obviously, a leopard can leave us in the dust in a short sprint. But over longer distances leopards and most other mammals flag. “Most mammals can't sustain a gallop over 10 to 15 minutes,” says Lieberman. Humans, on the other hand, can continue running for hours while using relatively little energy. “Humans are phenomenal endurance runners, in terms of speed, cost, and distance,” says Lieberman. “You can actually outrun a pony easily.” And yet, he points out, “no other primates out there endurance run.”

    Bramble and Lieberman believe that much of this skill comes from a large inventory of special adaptations in our muscles, tendons, and bones. They emphasize that these adaptations are not all-purpose traits that also help us walk upright. “Running is not fast walking,” says Lieberman. “You do not use the same mechanics.”

    To identify adaptations for running, the researchers have put people and animals on treadmills and measured the activity of various muscles and ligaments, along with the forces a running body generates. The nuchal ligament, for example, which stretches from the base of the human skull to the base of the neck, stands out. “It's an elastic band that has repeatedly evolved in animals that run. Apes don't have it,” says Bramble. He and Lieberman hypothesize that the nuchal ligament helps keep an endurance runner's head from bobbing violently. “Every time your heel hits the ground, your head wants to topple forward,” says Lieberman.

    Humans also have a special arrangement of tendons in their legs (including long Achilles tendons) that can act like springs. These tendons store about half of the energy of each stride and release it in the following one. “Chimps don't have these springs,” Lieberman says, also noting that recovering energy is important for endurance running but not for sprinting.

    Endurance.

    Humans are unique among primates in their ability to run long distances, thanks to a number of recently identified anatomical traits.

    CREDITS (LEFT TO RIGHT): MARY ALTAFFER/ASSOCIATED PRESS; ADAPTED FROM LASZLO MESZOLY

    Bramble and Lieberman have also zeroed in on the importance of a large rear end. By attaching electrodes to the gluteus maximus muscles of very cooperative volunteers, they have found that these muscles contract during each running stride, but not during walking—probably to stabilize the trunk. Chimps, by contrast, “have tiny rear ends,” says Lieberman.

    The fossil record suggests that these adaptations for endurance running emerged together about 2 million years ago, in the early species of our own genus, Homo. Paleoanthropologists have long noted that some early Homo were markedly different from earlier hominids. Australopithecus afarensis, which lived from about 4 million to 3 million years ago, stood 0.9 to 1.2 meters tall and had long arms and a wide pelvis. Homo ergaster, which lived in Africa between 1.9 million and 1.6 million years ago, was about as tall as modern humans and had long legs and relatively short arms. Most researchers ascribed these changes to adaptations for efficient walking. “That's the standard story you'll get in most textbooks,” says Lieberman. But Bramble and Lieberman have a different theory. Endurance “running is the only known behavior that would account for the different body plans in Homo as opposed to apes or australopithecines,” says Bramble.

    John Fleagle, an anatomist at Stony Brook University in New York, is impressed by Bramble and Lieberman's argument and wonders why no one thought of it before. “It's a real head-slapper,” he says. A number of their predictions remain to be tested, he points out, because the fossil record of early Homo is still incomplete. But he expects Bramble and Lieberman's paper to generate a lot of new research.

    The “sketchiest part” of their hypothesis, admits Bramble, is why hominids ran long distances. Paleoanthropologists generally agree that early Homo were primarily scavengers, using stone tools to cut meat off carcasses and crack open bones. “If you get [to the carcass] before the hyenas and the other hominids, you would have a lot of protein and fat at your disposal,” says Lieberman.

    By allowing hominids to get to more protein and fat, Lieberman suggests, running might have fueled the evolution of big hominid brains. “Large brains occur after the evolution of this modern, humanlike body form, and it may have been that the ability to do endurance running released a constraint on human evolution,” he says. He speculates that the importance of endurance running only faded once humans invented hunting weapons.

    “The importance of running to the tens of thousands of people who run marathons every year is not just a fluke,” says Lieberman. “I think it's a result of some important evolutionary history. It may have been lost with the invention of the bow and arrow, but the traces are still there in our bodies.”

  13. AMERICAN SOCIETY OF HUMAN GENETICS MEETING

    Of Worms, Mice, and Very Old Men and Women

    1. Jocelyn Kaiser

    TORONTO, CANADA—More than 5000 experts met here from 26 to 30 October for the annual meeting of the American Society of Human Genetics. Longevity, milk digestion, and cancer were among the topics.

    If your grandparents all lived to a ripe old age, you probably hope that “good genes” will bring you long life as well. Researchers in New York City have been exploring that notion by studying the physiology and DNA of Ashkenazi Jews who have lived for almost a century or longer. Last year, the scientists found that these elders have an unusual profile of lipids in their blood that may explain their longevity. In Toronto, the same group reported that their subjects tend to have a gene variant involved in a lipid pathway previously tied to long life in Caenorhabditis elegans, the model worm.

    This may be the first evidence that a similar longevity gene pathway acts in both worms and people, says Cynthia Kenyon, a researcher on aging at the University of California, San Francisco, who finds the link “really interesting.” In another talk, collaborators of the New York group reported that the centenarian Ashkenazis also tend to carry a particular variant of a gene that causes early aging in mice when it's turned off.

    The Longevity Genes Project led by Nir Barzilai of Albert Einstein College of Medicine includes more than 300 Ashkenazi Jews with an average age of 98, as well as their offspring and, as controls, age-matched individuals from Ashkenazi families with average life spans. The blood of these centenarians contains especially large lipoprotein particles, which are normally seen only in young, exercising adults (Science, 17 October 2003, p. 373). Barzilai's team also has shown that these elderly Jews and their offspring tend to have a mutation in the gene for cholesteryl ester transfer protein, which raises levels of the “good” high density lipoprotein (HDL) cholesterol and also increases particle size of both HDL and low density lipoprotein (LDL), the bad kind.

    At the meeting, team member Gil Atzmon reported that another gene involved in lipid metabolism appears to protect the Jewish centenarians. The gene codes for a protein called apolipoprotein CIII (ApoC-III), which is a component of LDL. ApoC-III stimulates production of harmful lipids called triglycerides; high levels of these compounds raise the risk of cardiovascular disease.

    In the genes.

    Some centenarians may owe their long life to mutations in genes that govern blood lipids.

    CREDIT: RON WINN/ASSOCIATED PRESS

    The centenarian families were more likely to have two copies of an ApoC-III gene with a particular one-base change in its promoter region than were controls (25% compared to 11%). Those with two copies of the ApoC-III variant had lower blood levels of the protein, larger lipid particles, and lower triglyceride levels, as well as lower rates of hypertension and resistance to insulin, two common signs of aging. The “most dramatic” result, says Barzilai, emerged when his team looked for this mutation in blood samples from a separate group of Ashkenazis, some of whom had died: Those with two copies of the ApoC-III gene variant tended to live 4 years longer, on average.

    Intriguingly, the ApoC-III gene is controlled by the gene for a transcription factor called FOXO1. The worm version of this gene is involved in a regulatory pathway that governs aging; altering the pathway can extend, even double, a worm's life span. In the worm, this pathway turns on a set of genes that include those producing lipids, Kenyon's group recently found. That suggests the same set of genes determines life span in both people and worms, Kenyon says: “I'm not sure anyone else has made this connection.”

    In collaboration with Barzilai's group, Harry Dietz's team at Johns Hopkins University has also examined the centenarian Ashkenazi Jews' blood samples for variants of KLOTHO, a gene that when deactivated in mice leads to what looks like early aging. In 2002, Dietz's team reported on a study of the gene in a group of elderly Czechs: Having just one copy of a certain variant of KLOTHO seemed to help people live longer, whereas two copies of this variant led to earlier death.

    At the meeting, postdoc Dan Arking reported that he and his Hopkins colleagues have found similar results in the centenarian Ashkenazi Jews, 44% of whom have died since the study began in 1998. Compared with the study subjects with one copy of the KLOTHO gene variant, those with no copies were twice as likely to have died; those with two copies had a 4.5-fold higher risk. Not only did those with one copy of the allele live longer, but they also had lower blood pressure and higher HDL levels, which means a lower risk of stroke, says Arking.

    The quest for longevity genes “is really tough” because life span is almost certainly controlled by a multitude of genes, says George Martin of the University of Washington, Seattle, editor-in-chief of the Science of Aging Knowledge Environment. Originally skeptical that any single mutation could be tied to human longevity, Martin now says that Barzilai's team “is making progress” at proving him wrong.

  14. AMERICAN SOCIETY OF HUMAN GENETICS MEETING

    Ural Farmers Got Milk Gene First?

    1. Jocelyn Kaiser

    TORONTO, CANADA—More than 5000 experts met here from 26 to 30 October for the annual meeting of the American Society of Human Genetics. Longevity, milk digestion, and cancer were among the topics.

    By some estimates, less than half of all adults can easily digest milk, a trait believed to have first appeared in people who kept dairy animals. Now scientists have traced the genetic roots of milk tolerance to the Ural mountains of western Russia, well north of where pastoralism is thought to have begun. The surprising result may support a theory that nomads from the Urals were one of two major farmer groups that spread into Europe, bringing the Indo-European languages that eventually diverged into the world's largest family of modern languages.

    Almost all mammalian babies produce lactase, the enzyme that digests the milk sugar lactose. But in most animals and many people, the lactase gene is gradually turned off after infancy, leaving them unable to tolerate milk as adults. Two years ago, a team led by Leena Peltonen of the University of Helsinki, Finland, and the University of California, Los Angeles, identified mutations near the lactase gene that are associated with adult lactose tolerance and likely play a role in regulating the lactase gene. Now, Peltonen's team has tried to trace the origins of lactose tolerance by looking at 1611 DNA samples from 37 populations on four continents.

    Milk route.

    A new study suggests that tribes from the Asian steppes (blue circle) migrated to the Ural mountains, where they mixed with locals (red circle), generating a gene variant endowing lactose tolerance that Ural farmers later spread.

    SOURCE: N. ENATTAH

    The populations having the greatest DNA sequence diversity around the lactase gene mutations—suggesting that lactose tolerance first appeared in them—include the Udmurts, Mokshas, Ezras, and other groups that originally lived between the Ural mountains and the Volga River. The trait most likely developed 4800 to 6600 years ago, Peltonen says. Her team linked the lactase gene changes to an ancestral variant that these groups apparently got from intermixing with tribes migrating from the Asian steppes.

    After the Ural peoples gained this earlier form of the lactase gene, the lactose tolerance mutation “probably emerged by chance,” says Peltonen, and then remained because it was beneficial for milk consumption. The Ural groups likely went on to spread the variant to Europe—especially northern Europe, which has the highest lactose tolerance today—and the Middle East. The findings support the somewhat controversial theory that nomadic herders known as Kurgans expanded into Europe from the southern Urals 4500 to 3500 years ago, bringing Indo-European languages with them, according to Peltonen.

    “I find [the new study] very interesting,” says population geneticist Luigi Luca Cavalli-Sforza of Stanford University. He notes that a competing idea for explaining the origin of the Proto-Indo-Europeans is that they were crop-growing farmers from the Anatolia region in modern Turkey (Science, 27 February, p. 1323). But the milk study reinforces Cavalli-Sforza's view that both theories are correct: Indo-Europeans migrated to Europe in two waves, first from Turkey and later from the Urals.

    Other geneticists caution that trying to pin down where a gene variant originated is tricky because the people in whom it's most common today may have migrated from somewhere else, or the original population could now be extinct. But if the milk gene's origin holds up, linguists and archaeologists will have new food for thought.

  15. AMERICAN SOCIETY OF HUMAN GENETICS MEETING

    New Prostate Cancer Genetic Link

    1. Jocelyn Kaiser

    TORONTO, CANADA—More than 5000 experts met here from 26 to 30 October for the annual meeting of the American Society of Human Genetics. Longevity, milk digestion, and cancer were among the topics.

    Prostate cancer strikes one in six men on average, and many researchers are looking for the genetic factors that contribute to an individual's risk. In Toronto, cancer geneticists presented data indicating that a relatively common variant of a gene involved in cell growth can raise a man's prostate cancer risk. The researchers also suggested how this variant may spur cancerous growth.

    Researchers at Mount Sinai School of Medicine in New York City had already found mutated versions of the gene, Kruppel-like factor 6 (KLF6), in many prostate tumors; the mutations disrupt KLF6's normal role of inhibiting cell growth (Science, 21 December 2001, p. 2563). Those mutations may have arisen late in life, but could more subtle variations in the gene, ones present from birth, set the stage for prostate cancer down the road? The same Mount Sinai team, led by John Martignetti, and collaborators have now examined this question by drawing on registries from three major cancer centers. The investigators screened for a previously identified KLF6 variant, a single-base mutation, in blood samples from 3411 prostate cancer patients—some with a family history of prostate cancer, some without—and controls.

    The variant was overrepresented in the patients, suggesting that it may predispose men to prostate cancer. About 17% of the patients with a family history of the disease and 15% of the patients with no such history carried at least one copy of the variant, whereas only 11% of the controls possessed a copy. From that data, the researchers calculate that, compared with men lacking the variant, men with at least one copy have an increased prostate cancer risk—about a 50% hike.
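
    That 50% figure can be sanity-checked from the carrier frequencies alone. A minimal sketch, using only the percentages quoted above (the study's own statistical model may of course differ):

    ```python
    def odds_ratio(p_cases: float, p_controls: float) -> float:
        """Odds ratio for carrying a variant, from carrier frequencies
        among patients and controls."""
        return (p_cases / (1 - p_cases)) / (p_controls / (1 - p_controls))

    # Carrier frequencies reported for the KLF6 variant:
    print(f"family history:    {odds_ratio(0.17, 0.11):.2f}")  # ~1.66
    print(f"no family history: {odds_ratio(0.15, 0.11):.2f}")  # ~1.43
    # Both land near 1.5 -- the 'about a 50% hike' quoted in the story.
    ```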

    Off balance.

    The protein encoded by KLF6 normally appears in the nucleus (top), but mutations in the gene lead to more of it in the cytoplasm.

    CREDIT: J. A. MARTIGNETTI

    The team also investigated how this particular variation in the KLF6 gene changes its function. They found that the protein encoded by KLF6 can come in three forms of differing sizes; cells with the variant make more of the two truncated forms. Instead of entering the cell nucleus and suppressing cell growth, these shortened KLF6 proteins stay in the cytoplasm, where they have the opposite effect. Tipping the balance of KLF6 proteins toward the short versions could promote cell growth and therefore explain why the KLF6 variant raises cancer risk, Martignetti says.

    Variations in at least three other genes have been identified as raising a man's prostate cancer risk, although the links haven't always been confirmed in different populations. To some cancer researchers, Martignetti's team has made a convincing case about KLF6. “It's about as solid as it could be” for an initial study, says Sean Tavtigian of the International Agency for Research on Cancer in Lyon, France. And although the KLF6 variant alone may only slightly raise a man's prostate cancer risk, it could act in concert with other gene variants, Tavtigian notes. If researchers can pin down how those risks add up, clinicians might one day screen a man's DNA to determine his true risk of prostate cancer—and prescribe preventive strategies to especially susceptible men.

  16. Measurement and the Single Particle

    1. Andrew Watson*
    1. Andrew Watson, based in Norwich, U.K., is author of The Quantum Quark.

    It's a guiding principle in science: to understand something, break it down to its smallest constituent parts. Now metrologists are getting into that game by counting out individual atoms, electrons, photons, or phonons to put their standards on rock-solid foundations

    In a vault in Paris sits an object that haunts metrologists' dreams. A cylinder of gleaming platinum alloy about the size of a mobile phone, it tells the world exactly how much mass makes up a kilogram—the only fundamental standard still defined by a physical object. Measurement experts consider the reference kilogram an annoying anachronism. Most would love to scrap it in favor of, say, a definition based on a set number of gold atoms. First, though, they must learn to count atoms reliably—a project still in the works (Science, 7 May, p. 812).

    Seeing the light.

    NPL's single photon counter relies on superconducting magnetic detectors.

    CREDIT: PATRICK JOSEPHS-FRANKS/QUANTUM DETECTION GROUP

    Similar efforts are under way in labs around the world, as technicians of the extremely small count everything from electrons to photons and phonons in hopes of bringing measurements down to basics. “It's turning all measurements into a counting exercise,” says Patrick Josephs-Franks of Britain's National Physical Laboratory (NPL) in Teddington, near London.

    Consider the electron, the lifeblood of electronic circuits. Neil Zimmerman of the U.S. National Institute of Standards and Technology (NIST) in Gaithersburg, Maryland, Mark Keller of NIST in Boulder, Colorado, and colleagues have developed an electron pump that dispenses single electrons into a capacitor and, once the capacitor's voltage change has been measured, counts them back out again. “They can actually control exactly the number of electrons going into this capacitor,” says physicist Per Delsing of Chalmers University of Technology in Göteborg, Sweden. On the face of it, what Zimmerman calls an electron “bucket brigade” is “very attractive as a fundamental current standard,” he says.

    At the moment, the pump generates currents at best 1000 times too small to be directly useful. But Zimmerman says the electron counter can be used to set a standard for capacitance, the ability of a device to hold electric charge. Zimmerman and Keller also hope to use the pump to test Ohm's law, a fundamental relation that states that the voltage across an electrical device is equal to the product of the current through it and its resistance. With two tried and trusted methods—the Josephson standard for measuring voltage and the quantized Hall standard for resistance—supplying two of the three sides of the V = iR triangle, NIST's electron pump will supply the standardized current needed to check the law for internal consistency.
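
    For a sense of the numbers: an ideal pump that hands over exactly one electron per gate cycle delivers a current of I = e × f. A minimal sketch, with an assumed cycling frequency rather than NIST's actual operating point:

    ```python
    E_CHARGE = 1.602176634e-19  # electron charge, coulombs

    def pump_current(cycles_per_second: float) -> float:
        """Current from an ideal single-electron pump: I = e * f."""
        return E_CHARGE * cycles_per_second

    # At an assumed 10 MHz cycling rate the pump delivers only ~1.6 pA --
    # picoamps, where a directly useful current standard wants nanoamps.
    print(f"{pump_current(10e6) * 1e12:.2f} pA")
    # In the Ohm's law test, this standardized current i, together with the
    # Josephson voltage V and the quantized Hall resistance R, lets
    # metrologists check how close V - i*R comes to zero.
    ```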

    The heart of the electron pump is a superconducting single-electron tunneling (SET) device. It exploits quantum tunneling, the ability of a particle—in this instance an electron—to leak through what is classically an impenetrable barrier. Electrons leak from the source electrode through one tunnel junction and into a conducting “island” beyond. Electrons can then escape the island via a second tunnel junction into the drain electrode. Voltage applied to a third electrode—the gate—adjoining the island influences the island region and allows electron flow from source to drain. As the gate voltage cycles up and down, electrons are sucked from the source and squeezed into the drain one at a time.

    That's the theory. Because of quantum uncertainty, however, occasionally two electrons move through, and sometimes none. To get reliable electron counting, researchers must strap together several islands to form a pump. The simplest possible pump comprises three junctions separated by two islands, with each island handing off electrons to the next controlled by its own gate. The trick is to keep the gate voltages out of step to maintain the flow. The NIST pump contains seven junctions, enough to guarantee just one error for every hundred million electrons pumped through.

    Chalmers's Delsing has also been working with several European metrology labs on developing ways of using an SET device to watch electrons as they pass through a circuit, rather than pumping them through. Their device works by passing the electrons through a cascade of islands and junctions, a little like the pump, while a separate SET transistor coupled to the chain sniffs electrons as they pass by. This electron counter is phenomenally sensitive, Delsing says. “The single-electron transistor is the most charge-sensitive device ever demonstrated,” he adds.

    Feeling the heat

    In neighboring Finland, SET devices are already being used to measure temperature, exploiting the fact that electron traffic through a string of junctions has a characteristic temperature dependence: a dip in the conductance near zero voltage changes with temperature. The SET-based thermometer works only below about 30 K, but this could be its strength: “Below 0.65 K or so, there is no standard for temperature,” Delsing says. “This might be a way to actually define the temperature scale at really low temperatures.”

    Another particle that can be used to measure temperature is the phonon, a quantized ripple that flows through solids carrying heat from one place to another. The warmer the solid, the greater the population of phonons zipping around inside it: A grain of table salt at room temperature is home to about 10^18 of them.

    At NPL, Josephs-Franks and his colleagues Ling Hao and John Gallop are using a minute carbon nanotube as a single-phonon bridge between two objects at different temperatures. They have attached a nanotube, about one nanometer in diameter, to the tip of an atomic force microscope probe, forming a kind of proboscis. When they touch this on a surface, phonons can zip up and down the tube—as annular bulges in its wall—between the probe and the substrate. Instead of a smooth variation in temperature between the probe and substrate, the temperature changes in steps, says Josephs-Franks. The device could be used as a nanoscale thermometer to measure “temperature” at different points in macromolecules.
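
    The story doesn't quantify those steps, but the natural yardstick for heat flowing through a single phonon channel is the quantum of thermal conductance, g0 = π²k_B²T/3h. The sketch below computes it purely as a frame of reference, not as NPL's own analysis:

    ```python
    from math import pi

    KB = 1.380649e-23    # Boltzmann constant, J/K
    H = 6.62607015e-34   # Planck constant, J*s

    def conductance_quantum(temp_kelvin: float) -> float:
        """Quantum of thermal conductance, pi^2 * kB^2 * T / (3 * h):
        the most heat a single phonon channel can carry per kelvin of bias."""
        return pi**2 * KB**2 * temp_kelvin / (3 * H)

    print(f"{conductance_quantum(1.0):.2e} W/K")  # ~9.5e-13 W/K at 1 K
    ```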

    Counting light

    Researchers have a number of tools, such as photomultipliers and avalanche photodiodes, that help them count light by the photon. But such counters don't meet the exacting demands of metrologists because they can't record each and every photon, and the range of frequencies they can see is limited. The dawn of the quantum information and communications era has given new impetus to the search for perfect photon detectors.

    One such device, developed by Norman Booth at the University of Oxford and his collaborators at Harvard and the University of Naples, is a superconductor-based photon counter that relies on quantum tunneling. When an incoming photon strikes a superconducting surface layer, electron pairs are broken up, sending electrons tunneling into a second layer. Here, these electrons spawn yet more electrons, yielding a detectable electronic blip via a third layer. The device works from the infrared to the x-ray parts of the spectrum.

    Sae Woo Nam of NIST Boulder and his colleagues have also developed a new detector for infrared photons using a thin film of superconducting tungsten. When an incoming photon heats the film, the tungsten momentarily loses its superconducting prowess and becomes an ordinary conductor. Researchers can spot the change by a sudden rise in electrical resistance. The tungsten returns to the superconducting state fast enough that the device can register 20,000 photons every second.

    Go with the flow.

    The voltage on the gate in this SET transistor coaxes single electrons from source to island to drain through the seemingly impenetrable tunnel junctions.

    CREDIT: ILLUSTRATION: J. MOGLIA/SCIENCE

    At NPL, Josephs-Franks and his colleagues have tried a different approach to counting light. They use highly sensitive magnetic detectors called superconducting quantum interference devices (SQUIDs) to encircle a speck of photon-absorbing material. When a photon hits the absorber, it warms it slightly, causing a change in its magnetic properties that is detectable by the SQUIDs. One of the device's potential strengths is an ability to see many frequencies. “In principle it can work from infrared to x-ray,” Josephs-Franks says. The trick is in finding the right absorber.

    Although there's always scope for doing measurements better, the single particle seems to represent some kind of limit for metrology. The charge on the electron, for example, is believed to be constant over time and the same for every electron, so it makes for a perfect standard, says Zimmerman: “If charge comes in fundamental units and cannot be further subdivided, that seems to me like the end of the story.”

  17. Getting the Measure of Nanotechnology

    1. Andrew Watson*
    1. Andrew Watson, based in Norwich, U.K., is author of The Quantum Quark.

    In the realm of the very small, measurements lose their certainty, and materials don't behave as they should. As chipmakers get to grips with circuits too small to measure, researchers are exploiting the oddities of the nanoworld to make new measuring devices

    Next year, when the chip giant Intel ships its first Pentium 5 processors, the transistors inside will set a new record. Their gate length—the distance between source and drain—will have dwindled to 25 nanometers—just 100 atoms across. “That's indeed nanoelectronics in volume manufacturing,” says Alain Diebold of International Sematech, an industry-backed R&D consortium in Austin, Texas. Even smaller things are on the way: Transistors with a gate length of just 5 nanometers have already been demonstrated in the lab.

    Good vibrations.

    Oscillations in this minuscule diving board, which spans the width of a human hair, can detect the magnetism of a single electron.

    CREDIT: IBM

    For decades, semiconductor industry observers have warned that chipmakers will need new measuring and manufacturing technology to cope with shrinking chips. And for decades, engineers have managed to stave off the next technological revolution, as better electron microscopes and materials and clever optical tricks have made it possible to stencil finer and finer features onto semiconductor wafers. Industry has made a huge investment in its current instrumentation and so prefers to “squeeze a few more angstroms of resolution” out of it to keep it on the production line a little longer, says Michael Postek of the U.S. National Institute of Standards and Technology in Gaithersburg, Maryland.

    Now, with 15-nanometer gate lengths looming and 10-nanometer lengths on the horizon, “we're going to begin to start feeling the pinch again,” says Postek. “We know there's going to be a barrier coming. The metrology tools are beginning to fall off.” This time, the experts say, radically different solutions may at last be unavoidable—and some are already in the works.

    One reason the semiconductor industry may have to change its ways is that soon the minute circuits and transistors on chips will be sharing space with minuscule machines known as nanoelectromechanical systems (NEMS). A typical NEMS device incorporates a mechanical element sensitive to force and a means of converting mechanical energy into an electrical or optical signal. The other characteristic is size: NEMS features are typically 10 nanometers, or 40 atoms, across, offering extreme sensitivity to forces.

    But designing and manufacturing NEMS brings new measurement challenges. At the scale of NEMS features, a distance that you measure may not be quite what you think it is. “Distance is not equal to distance,” according to Jörg Kotthaus, a physicist at the Ludwig Maximilians University in Munich, Germany. With a ruler in the macroworld, there are clear endpoints, firm stops against which you can press the ruler. “In the nanoscopic world it's like trying to measure distance on a foam mattress,” he says. And even if you can measure things accurately, two devices of the same size and shape may not behave exactly the same. Material properties rely on the assumption that a material is made up of a huge number of similar atoms or molecules. With NEMS, where the number of atoms or molecules is small, normal macroworld properties break down.

    Device variability is something the semiconductor industry has learned to live with, and the outlook for NEMS devices is much the same, believes Harvard University's Robert Westervelt. “You have to do the engineering in such a way that that kind of stuff doesn't matter,” he says. For example, at small scales the mass of a rod decreases much more rapidly than its linear dimensions because mass is linked to volume. As a result, a NEMS rod vibrates at a very high resonant frequency—up to 1 gigahertz, about the frequency of a mobile phone signal. But that tiny size also means that a few nanometers' difference in length, invisible irregularities in the material, or even the presence of a few contaminant atoms can dramatically change the rod's resonance response.
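
    Standard Euler-Bernoulli beam theory makes that scaling concrete: a cantilever's fundamental resonance grows as its thickness divided by the square of its length, so shrinking a beam tenfold in length raises its frequency a hundredfold. A sketch with illustrative silicon dimensions, not any particular published device:

    ```python
    from math import pi, sqrt

    def cantilever_f0(length_m: float, thickness_m: float,
                      youngs_modulus=169e9, density=2330) -> float:
        """Fundamental flexural resonance of a rectangular cantilever
        (Euler-Bernoulli beam theory; defaults are rough values for silicon)."""
        return (1.875**2 / (2 * pi)) * (thickness_m / length_m**2) \
            * sqrt(youngs_modulus / (12 * density))

    print(f"1 um beam:   {cantilever_f0(1e-6, 50e-9) / 1e6:5.0f} MHz")   # ~69 MHz
    print(f"100 nm beam: {cantilever_f0(100e-9, 50e-9) / 1e9:5.1f} GHz")  # ~6.9 GHz
    ```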

    Despite the challenges, Kotthaus and his colleagues are using this type of tiny cantilever to make an “electron spoon” to transfer electrons mechanically from one spot to another, perhaps as a way of developing an electrical current standard. It's a scaled-down version of a classroom classic: Charge two parallel metal plates up to a few hundred volts and dangle a lightweight metallized ball between them—the ball will bounce back and forth, transferring charge. At Cornell University, Harold Craighead and colleagues have used a NEMS resonator in a different way, by measuring the change in resonant frequency of a tiny cantilever as a mass is added to the end. Craighead's device can measure masses as small as an attogram (10^-18 g), and he is now developing it as a scale for weighing individual viruses and other biological structures.
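
    Craighead's weighing scheme follows from the same resonator physics: to first order, a small mass added at the free end lowers the resonance, with Δf/f0 ≈ -Δm/(2 m_eff). A sketch with invented but plausible numbers, just to show how attogram sensitivity falls out:

    ```python
    def added_mass(m_eff_kg: float, f0_hz: float, df_hz: float) -> float:
        """Mass loading of a resonator, to first order:
        delta_f / f0 ~ -delta_m / (2 * m_eff)."""
        return -2.0 * m_eff_kg * df_hz / f0_hz

    # Hypothetical device: femtogram-scale effective mass, a 10-MHz
    # resonance, and a measurable 5-Hz downward frequency shift.
    dm = added_mass(m_eff_kg=1e-15, f0_hz=10e6, df_hz=-5.0)
    print(f"{dm * 1e3 * 1e18:.1f} attograms")  # 1e-21 kg = 1.0 attogram
    ```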

    Daniel Rugar and his colleagues at IBM's Almaden Research Center in San Jose, California, have also used a NEMS cantilever to create a sensor and now hold a record for the tiniest force detected by mechanical means. They used an ultrasoft cantilever 85 micrometers long and just 100 nanometers thick with a magnetic particle attached to the free end. When the cantilever is vibrated, the strong local field of the magnet can interact with a nearby single electron, which causes a small frequency shift in the cantilever. “We detected the magnetic signal from a single ‘unpaired’ electron spin,” says Rugar. The technique is well-known in the macrolab, where it is called electron spin resonance, a sister of nuclear magnetic resonance. “Compared to conventional electron spin resonance, the technique is at least 10 million times more sensitive,” says Rugar. “The ultimate goal is to develop a technique that can take three-dimensional [magnetic resonance] images of molecules with atomic resolution.”

    Ironically, although it is difficult to make precisely reproducible devices on nanometer scales, if you persevere, things get easier. “Once you go down to a certain scale, of course you have built-in rulers, the interatomic spacing,” says Peter Cumpson of Britain's National Physical Laboratory in Teddington, near London. But although crystal lattices may provide a ruler, measurements of force become more murky. “It's difficult to separate your measuring apparatus from the device that you try to measure,” Kotthaus says. “I can count atoms, but anything beyond that, as metrology, is difficult.”

    One way to tackle calibration issues, material matters, and length determinations is to press toward the quantum limit, says Westervelt. “If you start looking at single quantum particles, you get saved by the discreteness of the particles and counting,” he says. Building the machines that can do this is incredibly hard, but once you climb that mountain, there are verdant pastures beyond. At the quantum limit, “it gets easier again,” says Westervelt. Of course, you have to get there first.

  18. Time's Romance of the Decimal Point

    1. Robert F. Service

    Time's running out for the cesium atomic clock. To squeeze additional decimal points of accuracy out of the world's top timekeepers, researchers are turning to some of the more esoteric corners of quantum mechanics

    BOULDER, COLORADO—David Wineland perceives time much like the rest of us do: It races when you're trying to beat a deadline but crawls endlessly when you're stuck in traffic. But Wineland, a laser and atomic physicist at the U.S. National Institute of Standards and Technology (NIST) in Boulder, Colorado, has another relationship with time, one that most people simply can't fathom. He's working with his NIST colleagues to parse time more finely than anyone has ever achieved.

    That's saying something, because the makers of the first atomic clock sliced a second so crisply that the uncertainty in their measurement was a mere 0.0000000001 of a second. That was in 1949. Today's record holder, NIST's F1 cesium atomic clock, has whittled that down to 0.0000000000000007 of a second. If a clock with that level of accuracy had been set up when the dinosaurs became extinct 65 million years ago, it would have dropped little more than a second by now. Yet, if the NIST team is successful, their clock—a mixture of lasers and trapped ions—wouldn't drop a second in 30 billion years, assuming anyone is still around to check.
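
    Those strings of zeros are easier to grasp as fractional frequency uncertainties multiplied by elapsed time. A quick sketch of the arithmetic behind both claims, using only the figures quoted above:

    ```python
    SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.16e7

    def accumulated_error(fractional_uncertainty: float, years: float) -> float:
        """Worst-case time error a clock with the given fractional frequency
        uncertainty can accumulate over the given span."""
        return fractional_uncertainty * years * SECONDS_PER_YEAR

    print(f"{accumulated_error(7e-16, 65e6):.1f} s")  # ~1.4 s since the dinosaurs
    print(f"{accumulated_error(1e-18, 30e9):.1f} s")  # <1 s over 30 billion years
    ```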

    Time trap.

    In this ion trap, the NIST team hopes to get aluminum and beryllium ions dancing in step.

    CREDIT: D. WINELAND

    This level of precision may seem esoteric; frankly, it is for most uses. But atomic clocks are now indispensable for a wide array of applications from satellite-based navigation systems and computer networks to managing the electrical grid and cell phone traffic. The NIST researchers and others continue to push the technology, not just in the hope of launching novel applications that invariably stem from fundamental advances in measuring time, but also because they can. As Barry Taylor, a retired NIST physicist, was fond of saying, there is no field with a greater romance for the decimal point than the measurement of time.

    All clocks rely on two basic components: a pendulum or some other “oscillator” that produces a regular set of “ticks,” and a way to count and display those ticks as the passage of time. The trick to accurate timekeeping and pushing back that decimal point is reducing the outside influences on the clock's components. Small changes in temperature, humidity, and local gravity, for example, can change the movement of a pendulum or the grinding of gears enough so that over weeks and months a mechanical clock's precision slowly veers.

    In 1945, U.S. physicist Isidor Rabi suggested that the quantum-mechanical behavior of atoms could provide an oscillator that was largely immune to outside influences. Rabi noted that quantum mechanics confines electrons around atoms to certain energy states, and that hitting atoms with just the right frequency of electromagnetic radiation can cause electrons to jump from one state to another. In essence, atoms act like ultraprecise radio receivers that can tune in to a station at just one very specific frequency.

    In cesium there is one such jump, called a hyperfine transition, that occurs when the atom is hit with microwaves with a frequency of 9,192,631,770 oscillations per second. And after cesium atoms undergo this transition, they fluoresce when hit with a precisely tuned laser pulse. So to use cesium as an atomic clock, researchers scan a sample of cesium atoms with microwaves across a range of frequencies until they see it fluoresce; then they hold their microwave emitter at that precise frequency. The oscillations of the microwaves are then the ticks of the clock, and researchers use other electronic detectors to count them.
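
    Because the SI second has been defined since 1967 as exactly 9,192,631,770 of those microwave oscillations, timekeeping then literally reduces to counting ticks. A minimal sketch:

    ```python
    CS_HYPERFINE_HZ = 9_192_631_770  # oscillations that define one SI second

    def elapsed_seconds(tick_count: int) -> float:
        """Time elapsed, given a count of microwave oscillations from an
        emitter locked to the cesium hyperfine transition."""
        return tick_count / CS_HYPERFINE_HZ

    print(elapsed_seconds(9_192_631_770))    # 1.0 second
    print(elapsed_seconds(551_557_906_200))  # one minute's worth of ticks
    ```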

    Although cesium-based atomic clocks have been around for decades, clockmakers have continued to improve them, reducing error rates by about a factor of 10 every decade. Atomic clockmakers are also looking to new designs (see Diddams Review on p. 1318 and Margolis Report on p. 1355). Many hope to harness atomic transitions in cesium and other atoms triggered by optical lasers, which oscillate at frequencies as much as 1000 times that of microwaves and thus can dice the second into far more ticks. But one downside is that the laser-based traps that confine cesium atoms can interfere with the measurements. As a result, researchers must essentially fire the lasers through a measurement apparatus, limiting the time they can efficiently use their atoms.

    Flash dance.

    A series of laser pulses cool aluminum and beryllium ions, excite the aluminum, transfer this energy first to the motion of the ions and then to beryllium's electrons, allowing it to fluoresce.

    ILLUSTRATION: J. MOGLIA/SCIENCE

    Some atomic clockmakers are working to get around this problem by trapping small numbers of ions in electromagnetic holding pens that don't interfere with the measurements, although this only works for some kinds of ions. The NIST group has been experimenting with aluminum, which works well in the electromagnetic traps but is a poor absorber of photons from lasers commonly used to cool atomic gases to the slow-moving state needed for accurate clocks. In addition, aluminum ions don't fluoresce as readily as other ions, making their transition difficult to spot.

    But beryllium, it turns out, solves both these problems. It's easy to cool with lasers, and it's easy to spot the different energy states of its electrons: Beryllium atoms in their unexcited or “ground” state readily scatter photons with a wavelength of 313 nanometers, but they are transparent to this light when in their excited state. Unlike aluminum, however, beryllium atoms don't have an ultrasharp transition at an optical frequency from one energy state to another—bad news for an atomic clock.

    So Wineland's colleagues Piet Schmidt, Till Rosenband, James Bergquist, and Wayne Itano are borrowing a trick known as quantum entanglement, which is used in quantum computing—another specialty of the Wineland lab—to link the states of the two ions so they can take advantage of the favorable attributes of each. Here, the goal is to transfer the atomic-transition information from the good clock atom, aluminum, to the good detection atom, beryllium.

    The researchers start by trapping both an aluminum and a beryllium ion in a tiny chamber. These ions are further confined by an electromagnetic field that pushes them back toward the center every time they try to escape. Next, they use standard laser cooling methods to chill the ions to a fraction of a degree above absolute zero. Even though the aluminum ion doesn't readily absorb laser light to slow it down, it continually bumps into the beryllium ion, transferring some of its kinetic energy. And the beryllium ion, cooled by the laser, acts as a refrigerator for the aluminum.

    Once trapped and cold, the ions are blasted with another laser, firing photons with a 267-nanometer wavelength. This time it's the aluminum ion's turn to go to work. The absorbed light kicks its electrons into a quantum-mechanical no-man's land called a superposition of states: part ground state and part excited state. Another laser pulse knocks aluminum back to its ground state and transfers its excitation energy into a change in motion of the combined aluminum and beryllium ions, which in their ultracold state move in concert like a single molecule. Next, another pair of laser pulses transfers this energy of motion to a superposition state of the beryllium ion, which in turn affects the fluorescence of beryllium in response to pulses from a final detection laser. Once the experiment is tuned up and working, the group uses the oscillation of the aluminum excitation laser as the ticking of the clock with 1.1 quadrillion ticks per second.
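
    That tick rate is simply the optical frequency of the 267-nanometer excitation laser, f = c/λ. A one-line check:

    ```python
    C = 2.99792458e8  # speed of light, m/s

    def optical_frequency(wavelength_m: float) -> float:
        """Frequency of light of the given wavelength: f = c / lambda."""
        return C / wavelength_m

    print(f"{optical_frequency(267e-9):.3e} Hz")  # ~1.12e15: 1.1 quadrillion ticks/s
    ```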

    Although Wineland says he's been kicking around the idea of using quantum entanglement in clocks for 10 years, he has only recently created the setup. Early indications suggest that it works, and the NIST team is working to nail down their results and tweak their laser setup to find the best clocklike energy transition in aluminum.

    If the early hints hold up, it would certainly be a significant step, says Kurt Gibble, an atomic physicist at Pennsylvania State University, University Park. For now, however, he says it's still too early to tell whether this approach or others will ultimately push the decimal place the furthest. “The payoff is that the accuracy and stability should be record breaking,” Wineland says. “We've been throwing around numbers like 10^-18,” which corresponds to measuring time in quintillionths of a second. “But talk is cheap. We haven't done that yet,” Wineland adds. If they do, it will create a new standard for measuring time that will not only be hard to beat but hard to fathom as well.

  19. Putting the Stars in Their Places

    1. Govert Schilling*
    1. Govert Schilling is a writer in Amersfoort, the Netherlands.

    Europe's Hipparcos set out to fix the positions of the stars with never-before-achieved accuracy. It did for most stars, but pinning down the distance of the Seven Sisters proved something of a headache

    Astrometry, the science of measuring the positions of stars, has come a long way since Ptolemy. The Egyptian astronomer cataloged the positions of 1022 stars with a maximum accuracy of one-sixth of a degree. Today's best star catalog, compiled by the European Space Agency's (ESA's) Hipparcos satellite, lists the positions of a million stars, some to milli-arc-second accuracy. But Hipparcos has also sown confusion and uncertainty since astronomers discovered that its measurement of the distance to the best known star cluster in the sky—the Seven Sisters, or Pleiades—differs from their conventional distance estimate by 10%. The discrepancy could have repercussions for measuring the size of the universe.

    Distance measurements are notoriously difficult in astronomy. Astronomers can get an accurate fix on stars within about 200 light-years of Earth using parallax: an apparent shift in the position of a nearby star against the unmoving background of distant stars, caused by Earth's motion around the sun. But the tiny parallax shift—usually a minute fraction of an arc second—becomes unmeasurably small for more distant stars. Parallax measurements show that a cluster known as the Hyades is 151 light-years away. On the assumption that stars in the Hyades and Pleiades are similar, 20th century astronomers compared their brightness and colors and calculated that the Seven Sisters are 440 light-years from Earth. Ever since, this distance estimate has been one of the steppingstones in determining the scale of the universe.
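
    The geometry is simple, and it shows how delicate the Pleiades measurement is: a star's parallax in arc seconds is the reciprocal of its distance in parsecs, where one parsec is about 3.26 light-years. A minimal sketch using the distances quoted in this story:

    ```python
    LY_PER_PARSEC = 3.2616

    def parallax_mas(distance_ly: float) -> float:
        """Annual parallax, in milli-arc seconds, of a star at the given
        distance: p[arcsec] = 1 / d[parsec]."""
        return 1000.0 * LY_PER_PARSEC / distance_ly

    print(f"Hyades,   151 ly: {parallax_mas(151):.1f} mas")  # ~21.6
    print(f"Pleiades, 440 ly: {parallax_mas(440):.1f} mas")  # ~7.4
    print(f"Pleiades, 400 ly: {parallax_mas(400):.1f} mas")  # ~8.2
    # The whole 10% dispute hinges on less than 1 mas -- the same order as
    # the systematic error van Leeuwen describes later in this story.
    ```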

    Enter Hipparcos, a revolutionary $300 million astrometry satellite launched by ESA in 1989. Using two precision telescopes, the rotating satellite mapped out a very accurate measurement grid on the sky from which it could derive stellar positions. Hipparcos plotted the positions of 1 million stars with a precision 15,000 times finer than Ptolemy's. For 118,000 stars, the accuracy was boosted another 20 times, to the milli-arc-second level, and precise parallax distances could be derived for stars out to hundreds of light-years. The resulting catalog, published in 1997, is the most accurate database of stellar positions ever produced. “Hipparcos put a lot of order into the slightly chaotic field of astrometry,” says Hipparcos project scientist Michael Perryman of ESTEC, ESA's R&D facility at Noordwijk, the Netherlands.
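    Those improvement factors are easy to verify; the arithmetic below is ours, not ESA's. One-sixth of a degree is 600 arc seconds, and dividing by the quoted gains lands exactly at the milli-arc-second level.

        # Checking the quoted accuracy gains (our arithmetic):
        ptolemy = (1 / 6) * 3600        # one-sixth of a degree = 600 arc seconds
        hipparcos = ptolemy / 15_000    # 15,000-fold gain -> 0.04" (40 milli-arc seconds)
        best = hipparcos / 20           # another factor of 20 -> 0.002" (2 milli-arc seconds)
        print(ptolemy, hipparcos, best) # 600.0 0.04 0.002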

    But while compiling the catalog, Hipparcos team member and star cluster expert Floor van Leeuwen, of Britain's Cambridge University, found that Hipparcos's parallax distance of the Pleiades was at most 400 light-years—10% smaller than the accepted value. The team had no reason to doubt the satellite's remarkable result. After all, explains Perryman, parallax is a purely geometric effect, whereas the older distance estimates relied on the astrophysical assumption that the Pleiades and the Hyades are similar. “Before you throw your hands up in the air and say that Hipparcos is wrong, you'd better be careful,” he says.

    Astrophysicists, however, were confident of their understanding of the Pleiades and instead cast doubt on Hipparcos. Early this year in Nature, for instance, astrophysicist Bohdan Paczynski of Princeton University suggested that the problem could be due to Hipparcos's highly eccentric orbit, the unintended result of a problem during the launch. In the same issue, Xiaopei Pan and colleagues at the California Institute of Technology in Pasadena presented new evidence from the Palomar Testbed Interferometer that one of the brightest stars in the Pleiades, called Atlas, is indeed 440 light-years distant. Perryman dismisses both challenges. Paczynski's remarks were “a comment given on the basis of ignorance,” Perryman says. As for the Caltech team, he counters, “there's no guarantee that Atlas is at the center of the cluster, as Pan and his colleagues assume.”

    Galactic positioning system.

    Hipparcos used two telescopes to map a precise grid onto the sky.

    CREDIT: ESA

    In a paper accepted for publication in Astronomy & Astrophysics, however, a team led by Susan M. Percival at Liverpool John Moores University in the U.K. makes a strong case that Hipparcos's mean distance to the Pleiades is indeed in error. Using visible and infrared observations of the cluster's stars, Percival and colleagues show that the colors, luminosities, and chemical makeup of the stars make sense together only at a distance of 436 light-years. “It seems rather convincing,” Perryman concedes.

    Van Leeuwen now believes he may have found the reason for Hipparcos's error. The Pleiades constitute a “weak spot in the catalog,” he says. He thinks the strong concentration of bright stars in the cluster threw off Hipparcos's delicate surveying strategy, resulting in a systematic error on the order of 1 milli-arc second. “There may be 10 or 20 other weak spots in the catalog,” he says. “Most of them are other star clusters.”

    Van Leeuwen is now meticulously reanalyzing the huge Hipparcos data set and hopes to publish a new catalog sometime in 2005. “I haven't a clue yet what the new distance value for the Pleiades will be,” he says. Perryman stresses that overall the Hipparcos results are still very reliable. “We'll end up with a catalog almost exactly the same,” he says. “We're talking about extremely small, very local effects.”

    In 2009 and 2011, two new astrometry missions will be launched: NASA's Space Interferometry Mission and ESA's Gaia. Both are designed to provide much higher precision than Hipparcos, in the micro-arc-second range. Van Leeuwen says that important lessons have been learned from Hipparcos: Understanding and solving the Pleiades problem is of paramount importance for the data-analysis routines for Gaia. “No one ever realized that the system was so extremely sensitive.”

  20. In the Blink of an Eye

    1. Alexander Hellemans*
    * Alexander Hellemans is a writer in Naples, Italy.

    Researchers want to freeze-frame the workings of atoms with laser pulses just billionths of a billionth of a second long. But first they must prove they really can produce a blast that short

    To photograph something that happens very quickly, you need a camera with lightning-fast shutter speed. But to study how molecules behave and interact, no shutter is fast enough. Instead, for a couple of decades, researchers have used flashes of laser light little longer than a femtosecond; that's just a millionth of a billionth of a second. Now they want to go even quicker. Over the past few years, scientists have passed the femtosecond frontier and are measuring their pulses in attoseconds: billionths of a billionth of a second.

    Such flashes should allow researchers to see inside an atom by freeze-framing the motion of an electron around the nucleus. They haven't got there yet—researchers are still learning how to produce the laser pulses cleanly and to measure their length—but they are looking forward to their first snapshots of the atom's interior. “I hope we will discover something we haven't even dreamed of,” says Ursula Keller of the Swiss Federal Institute of Technology in Zurich.

    The light fantastic.

    Superfast laser pulses could open a new window on the workings of matter.

    CREDIT: FERENC KRAUSZ/MAX PLANCK INSTITUTE

    A host of phenomena in molecules, atoms, and even solid matter takes place at attosecond time scales. “The dynamics of the electrons [during ionization] is much faster than a femtosecond. At one point they have to decide on which ion they go and sit, and this happens very fast,” says Keller. After a decade of work, researchers are only just getting a glimpse of such processes. “At the moment there are very few systems that are able to make these measurements,” says Ian Walmsley of the University of Oxford.

    It's impossible to make an attosecond pulse with visible light, because a single oscillation of a visible light wave takes more than a femtosecond, and a pulse can't be much shorter than one cycle of its own wave. But in the early 1990s, several researchers suggested a way to make short pulses using shorter wavelengths, in the extreme ultraviolet (XUV) range. The technique, known as high-order harmonic generation, involves hitting atoms of a rare gas with a powerful femtosecond pulse from an infrared laser. As the electric field component of the infrared pulse oscillates back and forth, it rips electrons off the atoms and then slams them back into the ions they left behind. As the electrons recombine and return to the ground state, they emit a burst of radiation that is a combination of high harmonics of the applied infrared frequency. The result is a sharp, attosecond-long XUV pulse.
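    The cycle-length argument is just arithmetic: one full oscillation takes the wavelength divided by the speed of light, so visible light is simply too slow, as this sketch of ours illustrates.

        # Why attosecond pulses need XUV light (illustrative arithmetic only):
        # a pulse can't be much shorter than one cycle of its carrier wave.
        C = 299_792_458  # speed of light, m/s

        def cycle_in_attoseconds(wavelength_nm: float) -> float:
            """Duration of one oscillation, in attoseconds (1 as = 1e-18 s)."""
            return (wavelength_nm * 1e-9 / C) * 1e18

        print(cycle_in_attoseconds(600))  # visible light: ~2000 as, i.e., ~2 fs per cycle
        print(cycle_in_attoseconds(30))   # XUV: ~100 as per cycle, attosecond territory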

    Anne L'Huillier, now at Lund University in Sweden, pioneered this technique during the 1990s while working at the French Atomic Energy Commission's Saclay research center at Gif-sur-Yvette. But at first, researchers were able to make only strings of attosecond pulses about 1.3 femtoseconds apart. To get a snapshot of events inside the atom, they needed clean, isolated attosecond pulses. Part of the problem was that the infrared pulses used to make the attosecond flashes were themselves untidy and chaotic. The shape of the pulses—how the amplitude of the radiation rose to a peak and then subsided again—bore no relation to the electromagnetic waveform that oscillated within them.

    Researchers needed infrared pulses in which the maximum of the electromagnetic wave coincides with the maximum of the pulse envelope. Only that peak electromagnetic wave has the intensity to generate an XUV burst. In 1999, Keller's group proposed a way to make such a wave using a feedback mechanism that detects the state of the electromagnetic wave and tweaks the laser that produces it. But it was Ferenc Krausz of the Max Planck Institute for Quantum Optics (MPQ) in Garching, Germany, who turned theory into reality. In 2003, while he was at the Technical University of Vienna, his group reported neat single XUV pulses. “The Vienna-MPQ group is now clearly the leading group in this area. They have a system that works, and it works well,” says Walmsley.

    Although the pulses were undoubtedly short, Krausz and his team still had to prove that they were less than a femtosecond long. Earlier this year, Krausz employed a technique known as a “streak camera” to measure the pulse length. He and his colleagues directed an XUV flash at a target of neon atoms. The pulse tears electrons from these atoms, and then the electric field of a second, infrared light pulse sweeps them sideways into an electron detector. From the energy distribution of these electrons, the researchers could determine the duration of the XUV pulse—a speedy 250 attoseconds.

    To demonstrate what attosecond pulses can do, Krausz and his team used them to make a waveform of light visible (Science, 27 August, p. 1267). In a technique they've dubbed the “light oscilloscope,” the team ejected electrons from some atoms by blasting them with an attosecond XUV pulse and then hit those electrons with a femtosecond infrared pulse. During the small time window of 250 attoseconds, the electric field associated with the infrared light wave accelerates these electrons, which are then captured by a detector. From their arrival times and energies, the team could deduce the shape of the infrared light wave.
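    A cartoon version of the idea, our simplification with invented numbers rather than the group's actual analysis, is that an electron set free at time t picks up an extra momentum proportional to the infrared field's vector potential at that instant, so scanning the delay between the XUV and infrared pulses traces out the light wave point by point.

        # Toy sketch of the "light oscilloscope" (our simplification; the
        # parameters are invented for illustration). In atomic units, an
        # electron released at time t acquires a drift-momentum shift equal,
        # up to sign convention, to the vector potential A(t) at release.
        import math

        E0 = 0.02       # assumed peak infrared field, atomic units
        OMEGA = 0.057   # angular frequency of ~800-nm infrared light, atomic units

        def vector_potential(t: float) -> float:
            return -(E0 / OMEGA) * math.sin(OMEGA * t)

        # Sampling the momentum shift at a series of XUV-infrared delays maps
        # out the waveform of the infrared light itself.
        for step in range(6):
            t = step * 20.0  # delay in atomic units of time (~0.48 fs apart)
            print(f"delay {t:5.1f} a.u. -> momentum shift {vector_potential(t):+.4f} a.u.")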

    Several groups in Europe, North America, and Japan are now gearing up to do similar research, says Walmsley. “The tools and techniques of attosecond metrology are now really ready,” says Krausz. One such team is led by theoretical physicist Thomas Brabec of the University of Ottawa. “We are working on potential applications in atoms and clusters … because [clusters] are the transition between atoms and condensed matter,” he says. And Ahmed Zewail of the California Institute of Technology in Pasadena, who received the 1999 Nobel Prize in chemistry for his pioneering work in femtochemistry, is now also looking through this new window at matter in a state never seen before: “If you can catch any system in a very short time, then you are far from the equilibrium state of these systems, and in refining more and more the time resolution, you will find some interesting phenomena,” says Zewail.
