News this Week

Science  03 Feb 2006:
Vol. 311, Issue 5761, pp. 588
  1. SCIENCE EDUCATION

    Strategies Evolve as Candidates Prepare for Kansas Board Races

    1. Yudhijit Bhattacharjee

    LAWRENCE, KANSAS—Billboards touting everything from steak to flat-screen TVs assault drivers speeding along I-35 across the American Midwest. But an unusual sales pitch pokes out of the ground just as the interstate leaves Missouri and enters Kansas. “Evolution is a fairy tale for grown-ups,” the sign proclaims, steering viewers to a Web site that mocks the idea that evolution can explain the origin of human beings.

    Legal talk.

    Lawyers for the Dover, Pennsylvania, plaintiffs joined Kansas Citizens for Science in Lawrence last week in attacking the state's new science standards.

    CREDIT: Y. BHATTACHARJEE/SCIENCE

    That Web site (scienceprovesit.com) is a stark reminder of what scientists and educators face as they battle new state science standards casting doubt on evolutionary theory and effectively opening the door to intelligent design (ID) and creationist instruction in Kansas public schools. The terms of four of the six-person majority on the state school board that adopted those standards 3 months ago end this year, and defenders of evolution hope voters will choose moderates in their place who will work to have those standards thrown out. Last weekend, a rally in Lawrence by Kansas Citizens for Science (KCFS) served as a de facto kickoff to these candidates' campaigns.

    But moderates hoping to unseat the incumbents say a frontal assault on the new standards would be self-defeating in a state where conservative voters may sympathize with ID even though they have no appetite for the fundamentalist, right-wing groups that have led the charge against evolution. Instead, they plan to attack other board actions that they believe are unpopular with voters, including state-funded vouchers for private schools and a newly appointed education commissioner whose qualifications have been questioned. They also hope to draw on the recent U.S. District Court ruling that threw out ID language inserted by the Dover, Pennsylvania, school board, calling the attempt an unconstitutional intrusion of religion into the classroom and ordering it to pay what are expected to be significant legal fees.

    Battlefield Kansas.

    State school board candidate Harry McDonald faces a challenge from antievolution groups such as the sponsors of this billboard.

    CREDIT: Y. BHATTACHARJEE/SCIENCE

    “If I were to make the new science standards the focus of my campaign, it's very likely that I would lose,” says Harry McDonald, a former biology teacher who is running against incumbent John Bacon, who voted for the new standards, in the 1 August Republican primary. Republican Sally Cauble, a former elementary school science teacher contesting a seat held by strident ID supporter Connie Morris, echoes the thought. “You have to watch out for that strong undercurrent of support for faith-based education, including intelligent design,” she says. McDonald and Cauble say they won't hide their proevolution stance but that they'd prefer to have voters raise the issue. “It's not mentioned in any of my campaign materials,” says Cauble.

    Supporters of evolution admit grudgingly that ID proponents have successfully framed the issue as a battle between science and religion. “It's a tricky line for the candidates to navigate,” says KCFS's Jack Krebs, who spoke at the Lawrence meeting alongside lawyers representing the Dover parents who prevailed in December (Science, 6 January, p. 34). “Since sophisticated discussions on evolution and religion are not common in our society,” Krebs says, “it's very easy for right-wing groups to brand the challengers as godless atheists.”

    A former president of KCFS who now conducts workshops for science teachers and cultures butterfly larvae to donate to schools in his district, McDonald takes care not to come across as a passionate evolutionist. His literature mentions the new standards as an example of micromanagement by the current board, which took over the writing of the standards last year after rejecting a draft submitted by the science standards writing committee. “There are ID sympathizers in my constituency who might be willing to forgive me my transgressions for being a strong science supporter because of other issues. But if I spent too much time on evolution and ID, they might not.”

    Bacon told Science he hasn't decided whether to seek reelection to another 4-year term. But he says that, should he run, he would have no qualms advertising his role in promoting the new standards, even though it would not be a centerpiece of his campaign. “I've seen polls showing that the majority of people in the state want their kids to be exposed to all theories of origin science in the classroom,” he says. “If evolution is a theory, they want it taught as a theory, not as a fact.”

    Both sides agree that, to the extent that evolution gets discussed during the electoral race, the Dover decision will certainly help the challengers. But John Calvert, the managing director of the ID Network in Shawnee Mission, says it would be unfair to compare the actions of the two boards. “The Kansas standards do not require the teaching of ID in the classroom,” he says. “What they do is give teachers the freedom to answer critical questions about evolution without fear of being leaned on.”

    Don Weiss, a dean at DeVry University in Kansas City and a Democratic challenger who would face Bacon in November should both win their primary races, says he intends to use the financial aspect of the Dover ruling as ammunition. “Either we can have a very expensive lawsuit, or we can get it taken care of through the election,” Weiss says he'll tell voters. But then he'll reclaim the high ground. “My broader message is going to be about improving the quality of education in Kansas, so that our kids can compete in a global economy.”

  2. PLANETARY SCIENCE

    New Hubble Image Cuts the "10th Planet" Down to Size

    1. Robert Irion

    LOS ALTOS HILLS, CALIFORNIA—Confounding previous estimates, the so-called 10th planet is Pluto's near-twin in size, according to a new image from the Hubble Space Telescope. The object is just a “smidge” bigger than Pluto, not 25% to 50% bigger, an astronomer reported here last week, and unusually reflective. The downsizing illustrates the quandary facing scientists as they try to define whether large residents of the frigid Kuiper belt are bona fide planets.

    Planetary scientist Michael Brown of the California Institute of Technology (Caltech) in Pasadena and colleagues found the object, designated 2003 UB313, as a slow-moving dot of light. It traces an elongated orbit and is currently near its farthest point from the sun, 97 times Earth's distance, making it the most remote body yet seen in our solar system. Despite its distance, the object dubbed “Xena” by Brown's team appears so bright that last July NASA described it as markedly larger than Pluto (Science, 5 August 2005, p. 859). But researchers sought better data to gauge its true size.

    One new study, published this week in Nature, favors a chubbier Xena. A team led by radio astronomer Frank Bertoldi of the University of Bonn, Germany, used the IRAM 30-meter radio telescope at Pico Veleta, Spain, to measure the object's heat emissions. Their analysis points to a diameter of 3000 kilometers, compared to 2300 kilometers for Pluto—but with substantial error bars.

    Such errors are far smaller with a direct view from orbit, Brown says. The Hubble Space Telescope zeroed in on Xena in December 2005. Brown showed the newly analyzed image to about 1000 people at a public lecture here at Foothill College. The blob of light, spanning several pixels on Hubble's detector, was resolved well enough for Brown's team to determine that Xena is barely bigger than Pluto. Brown said he would reveal the calculated size at a NASA press briefing. For now, he said, “I'm going to stick with the word ‘smidge.’ It's a really good word.”

    Pluto plus.

    Distant “Xena”—shown in a ground-based image with its small moon—is barely bigger than Pluto, a new Hubble photo reveals.

    CREDIT: A. BOUCHEZ, M. VAN DAM, D. LE MIGNANT/W. M. KECK OBSERVATORY LASER GUIDE STAR ADAPTIVE OPTICS TEAM

    But the size was evident from a statistic shown by Brown: Xena reflects a remarkable 92% of optical light, like the finest fresh snow. “I had expected it to be darker and considerably larger,” Brown said. This measure, called albedo, is derived from the object's apparent brightness, distance, and diameter. According to a chart on Brown's Web site, that diameter is roughly 1% larger than Pluto's—down from the team's previous guesses of 25% larger (on the Web site) to 50% larger (at NASA's July announcement).
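    As a rough guide (a standard relation not quoted in the article), the apparent brightness and distance fix an object's absolute magnitude $H$; the geometric albedo $p$ and diameter $D$ then trade off roughly as $D \approx (1329~\mathrm{km}/\sqrt{p})\times 10^{-H/5}$, so measuring the diameter directly, as Hubble did, pins down the albedo as well.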

    Icy bodies darken with age, so geysers must recoat Xena's surface with fresh frost, Brown said. Planetary scientist David Stevenson of Caltech notes that Saturn's active moon Enceladus is the only other object in the solar system that glistens as radiantly. But Enceladus flexes during its eccentric orbit around Saturn, generating enough heat to expel icy compounds from the moon's interior. There's no obvious way to spark such action on Xena—even with its small moon. “Frankly, volcanism in the Kuiper belt is hard,” Stevenson says. “Maybe we don't understand the dynamics of crystallization and the physics of ice surfaces.”

    Nor will Xena help the messy debate over planet nomenclature. Late last year, a working group of the International Astronomical Union (IAU) failed to agree on any of three proposed “planet” definitions and passed the buck to IAU's executive committee. Astronomers are finding so many planet-like objects—both in our solar system and around other stars—that the prudent course may be to wait instead of forcing a hasty consensus, says committee member Robert Williams of the Space Telescope Science Institute in Baltimore, Maryland.

    Although people are loath to demote Pluto from planethood, they may not want dozens of Pluto-size “planets” either, says Foothill astronomer Andrew Fraknoi. “It's almost cosmic justice” that Xena and Pluto are a near match, he says. “Welcome to the borderland of science.”

  3. EVOLUTION

    Hidden Genetic Variation Yields Caterpillar of a Different Color

    1. Elizabeth Pennisi

    People change clothes depending on the temperature outside. Tomato hornworms change color. These caterpillars emerge green when it's above 28°C and black when it's cooler. Now two insect physiologists report on page 650 that they have teased out a possible genetic basis of this color change by breeding a mutant strain of a related species, the tobacco hornworm, until it too undergoes a similar switch.

    The study demonstrates how species can mask effects of genetic mutations until an environmental trigger reveals them, an adaptive mechanism that may help organisms survive changing conditions. The work “is a tour de force of experimental evolutionary biology,” says Mary Jane West-Eberhard, an evolutionary biologist at the Smithsonian Tropical Research Institute in Costa Rica. “It [begins] to answer a question of fundamental importance: How does a novel, environmentally sensitive trait originate?”

    Fashion statement.

    Tobacco hornworms can evolve a finely tuned sensitivity to heat that causes them to emerge green instead of black.

    CREDIT: Y. SUZUKI ET AL., SCIENCE

    Organisms that live in variable environments often evolve traits—called polyphenisms—that change according to particular conditions. Aphids become winged or wingless, for example, depending on food availability. The tomato hornworm's color change serves a similar adaptive purpose. In the cooler northern United States, the caterpillars that emerge in the autumn are black to absorb more sunlight, but in the south, where camouflage is more important than heat conservation, they're green. In contrast, tobacco hornworms are typically green, no matter the temperature.

    The genetic underpinnings of polyphenisms have long been a puzzle, notes Douglas Emlen, an evolutionary biologist at the University of Montana, Missoula. To examine how the tomato hornworm's color-shifting may have arisen, Yuichiro Suzuki, a graduate student working with Frederik Nijhout at Duke University in Durham, North Carolina, turned to a tobacco hornworm mutant that is black rather than the normal green. Its mutation reduces secretion of juvenile hormone, which regulates skin coloring. This mutant strain, however, generates caterpillars with varying degrees of green if it is heat-shocked—briefly exposed to a very high temperature—at an early stage of development.

    Suzuki used this heat-shock method to select for two spinoff strains. In one case, he mated only tobacco hornworm caterpillars that remained dark despite the heat shock, weeding out greenish ones each generation. By the seventh generation, this line, even after being heat-shocked, produced only black larvae. At the same time, Suzuki bred the caterpillars that developed the greenest skin when heat-shocked. Over time, this selection had dramatic results, creating a strain whose caterpillar form always emerges green instead of black if grown above a specific threshold temperature, 28.5°C.

    The experiments indicate that low juvenile hormone levels in the original black mutant had enabled already-existing variants involved in pigment production to exert their effects, depending on the temperature. Normal tobacco hornworms have very high amounts of juvenile hormone, but Suzuki and Nijhout showed that the heat-insensitive version of the mutant strain had very little and the newly created polyphenic strain had levels in between. In this latter strain, higher temperatures resulted in more juvenile hormone and, consequently, greener skin. Suzuki and Nijhout propose that there may be other cases in which evolution has exploited developmental hormones to create polyphenic traits.

    Evolutionary biologist Mark Siegal of New York University cautions that what happens in the lab isn't necessarily what happens in real life. But he applauds the study. “[This] laboratory demonstration is an important first step that will guide the crucial, and difficult, effort to understand actual evolutionary histories,” he says.

  4. INFECTIOUS DISEASES

    Tackling Neglected Diseases Could Offer More Bang for the Buck

    1. Gretchen Vogel

    STOCKHOLM—Public health efforts in the developing world are missing out on a bargain, say a group of researchers and health policy leaders. At a meeting here* and in a recent paper, they argue that the ramped-up efforts against the Big Three—HIV/AIDS, tuberculosis, and malaria—will yield far bigger dividends if they are coupled with an attack on so-called neglected diseases such as hookworm, schistosomiasis, and leishmaniasis. These infections make their victims more susceptible to the Big Three, the researchers contend.

    Double benefit.

    Treating the Ascaris worms that had infected this girl (inset) may leave her less vulnerable to other diseases.

    CREDIT: PETER HOTEZ/GEORGE WASHINGTON UNIVERSITY

    Up to seven neglected tropical diseases could be tackled for just 40 cents per person per year, they say. “It's the best buy in public health at the moment,” says Alan Fenwick, a schistosomiasis researcher at Imperial College London.

    Unlike HIV and malaria, lymphatic filariasis and onchocerciasis do not trip off the tongues of world leaders. Nor do such neglected diseases directly kill as many people as the Big Three. Instead, they take their toll more insidiously, through stunted growth, anemia, and blindness, contributing to widespread developmental and learning delays. These infections, both bacterial and parasitic, “are the world's leading cause of growth deficits and the world's leading education problem,” says Peter Hotez, a parasitologist at George Washington University in Washington, D.C.

    But neglected tropical diseases are vulnerable to a concerted campaign. Effective drugs—inexpensive or donated by drug companies—are available against many of them. And in a paper published 30 January in the Public Library of Science Medicine, Hotez, Fenwick, and their colleagues argue that treating the 500 million people afflicted would cost just $200 million a year—compared to $500 million pledged this year for antimalaria efforts.
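    The two figures are consistent: at the roughly 40 cents per person per year cited above, covering 500 million people works out to about $200 million annually (0.40 × 500 million).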

    At the same time, the authors argue, treating these seven diseases—the helminth infections ascariasis, trichuriasis, hookworm, lymphatic filariasis, onchocerciasis, and schistosomiasis, and the bacterial infection trachoma—might benefit the ongoing fight against the Big Three. They point to a growing body of evidence that suggests that populations infected with multiple parasites are more susceptible to other diseases—including the big killers.

    The payoffs for malaria control might be especially worthwhile. Intestinal parasites are a leading cause of anemia—exacerbating one of the main complications of severe malaria. Hotez points to a study in Senegal that found that deworming medicines significantly reduced malaria cases.

    There is also preliminary evidence that HIV patients infected with multiple parasites have higher viral loads and lower immune cell counts than their counterparts who are worm-free. And several studies have shown that worm infections can lessen the effectiveness of vaccines against other diseases. “Unless we do something about polyparasitism, we are not going to have a big impact on the Big Three,” Hotez says.

    On a practical level, the infrastructure for distributing deworming drugs could also be used to deliver antimalarial bed nets. The researchers hope to put their ideas into practice soon. At this week's meeting, researchers and public health leaders from eight African countries met to devise a “quick impact initiative” that would create national programs to tackle malaria and the neglected diseases together.

    Getting drugs where they are most needed is the greatest challenge, says William Lin of Johnson & Johnson. Lin is in charge of his company's effort to donate 50 million doses of mebendazole, used to treat hookworm and other helminths. “I've asked them to ramp up production,” he says. “I don't want to be left at the end of the year with stores in the warehouse—and egg on my face.”

    • *U.N. Millennium Project: A Malaria and Neglected Tropical Diseases Quick-Impact Initiative, 30–31 January, Stockholm, Sweden.

  5. GLOBAL WARMING

    Climate Change Demands Action, Says U.K. Report

    1. Daniel Clery

    CAMBRIDGE, U.K.—As climate change climbs up the political agenda, researchers have pooled much of the most recent research into what many believe is a compelling case for the immediacy of global warming.

    This week's report,* based on a meeting convened last year at the request of U.K. Prime Minister Tony Blair, warns of catastrophic consequences if steps are not taken now. It says a range of measures, from emissions trading to nuclear power, are needed to both minimize future impacts and cope with those that cannot be avoided. “It is clear from the work presented that the risks of climate change may well be greater than we thought,” says Blair in a foreword to the report. “The U.K. government is taking this issue very seriously,” says glaciologist David Vaughan of the British Antarctic Survey, “and it's nice to see the government consulting scientific opinion.”

    During 2005, Blair was both chair of the G8 leaders of industrial powers and president of the European Union and pledged to use his twin roles to combat global poverty and climate change. To advance the climate initiative, 200 researchers from across the globe met at the Hadley Centre for Climate Prediction and Research in Exeter last February. The meeting came 4 years after the last assessment report from the Intergovernmental Panel on Climate Change (IPCC)—the benchmark for global warming—and the scientists chewed over new results. “It was a good time to take stock,” says steering committee chair Dennis Tirpak, head of the climate change unit at the Organisation for Economic Co-operation and Development in Paris.

    According to the meeting report, “compared to the [IPCC's 2001 assessment], there is greater clarity and reduced uncertainty about the impacts of climate change.” The report contains models showing how the acidity of the oceans will increase as a result of more carbon dioxide in the atmosphere. It also forecasts that sea levels will continue to rise for 1000 years as a result of thermal expansion of the oceans and melting of the Greenland and Antarctic ice sheets, even if greenhouse gas emissions are stabilized. “Once peripheral melting is under way around Greenland,” Vaughan says, “the ice sheet may enter a state where it can't sustain itself.”

    Tirpak says politicians need to realize that time is running out and that the next generation may live on a planet that has no icecaps in the summer months. “It will be a profoundly different world, and we cannot imagine what that will mean,” he says. “Do you want to risk the consequences?”

  6. BIOMEDICAL RESEARCH POLICY

    NIH Lends a Hand to Postdocs Seeking to Become Independent Researchers

    1. Jocelyn Kaiser

    Concerned about the graying of the investigators it funds, the National Institutes of Health (NIH) last week unveiled a new “bridge” grant to help postdocs become independent researchers. Individuals could receive nearly $1 million over 5 years to cover research and training expenses. The first awards will be made next fall.

    Even in tight budget times, “nothing is more important than supporting the new investigators early,” said NIH Director Elias Zerhouni of the $390 million program. The funding will come from taking a “sliver” of each institute's overall budget, Zerhouni says. The chair of a 2005 National Academies panel that recommended the award's creation is delighted with the result. “This is exactly the sort of thing we were hoping for,” says Thomas Cech, president of the Howard Hughes Medical Institute in Chevy Chase, Maryland.

    Bucking a trend.

    NIH hopes new grants will boost the share of competing research grants now going to new investigators.

    SOURCE: NIH

    The average age of a Ph.D. investigator winning his or her first research grant, called an R01, has risen from 37 to 42 in the past 25 years. Nearly a decade ago, NIH abandoned a smaller research award for young investigators because it didn't seem to help scientists get R01s. Now NIH is trying again.

    The Pathway to Independence award combines traditional training and research grants (Science, 9 December 2005, p. 1601). The first 1 or 2 years cover the completion of a postdoc, at $90,000 per year (including 8% for overhead costs). Grantees who win a position as a tenure-track assistant professor can then apply for up to $250,000 a year for 3 more years for research. NIH says non-tenure track research faculty members are also eligible. The hope is that these investigators will then be in a good position to win R01s.
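    Those terms match the “nearly $1 million over 5 years” figure cited above: a maximum of 2 years at $90,000 plus 3 years at $250,000 comes to $930,000.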

    The research portion of the grant will cover full overhead costs, which can be as high as 50%. That feature should give universities a strong incentive to create positions for these investigators, Zerhouni says. “This is going to make it a lot easier for postdocs to get a faculty position because they're bringing so much money with them,” adds Alyson Reed, executive director of the National Postdoctoral Association, which had also recommended the award's creation.

    NIH hopes to award 150 to 200 fellowships a year in the next 6 years to postdocs sponsored by their institutions. “That's enough to really make a difference,” says Cech. Indeed, NIH hopes that the new award will help boost the share of R01s going to new investigators from 20% to 25% (see graph). Cech says it's also important that non-U.S. citizens are eligible and that the grants can be transferred to other institutions.

    NIH is still weighing another recommendation from the academies panel for a new-investigators R01 program with grants based on experience rather than data. NIH's environmental health institute has begun a pilot project to test the idea.

    (Also see related article on Science Careers.)

  7. THEORETICAL PHYSICS

    Ring Around a Quasar May Deflate Quantum Foam After All

    1. Adrian Cho

    A halo in an image of a distant galaxy rules out some conceptions of the frothy “quantum foam” thought to make up space and time at the smallest scales, a team of physicists claims. If true, the observation clamps the first experimental limit on quantum gravity, the highly theoretical field that strives to marry quantum mechanics and Einstein's general theory of relativity.

    Bubble burster.

    Irislike “Airy ring” around quasar PKS 1413+135 (black dot, center) may nix some versions of quantum foam.

    CREDIT: E. S. PERLMAN ET AL., ASTRONOMICAL JOURNAL 124, 2401 (2002)

    Ironically, 3 years ago the team shot down a similar claim that quantum foam would obliterate the optical artifact. But the new analysis takes into account a physical effect the previous work missed, and others say it appears sound. “I looked at it as carefully as I could, and I could not find any obvious mistake,” says Eric Perlman, an observational astrophysicist at the University of Maryland, Baltimore County.

    The halo appears around a quasar—the fiery heart of a galaxy—in an infrared image the Hubble Space Telescope snapped in 1998. The “Airy ring” arises because light waves diffract slightly at the edges of a telescope's aperture, in ways that create rings around any pointlike object.

    But the effect occurs only if the waves remain neat and orderly as they travel the 4 billion light-years from the quasar. Quantum foam would fuzz them out, say theoretical physicist Yee Jack Ng and colleagues at the University of North Carolina, Chapel Hill. So the halo rules out the most chaotic models of the foam, they argue in a paper to be published in Physical Review Letters. That's a big claim, as most theorists agree that the foam is an unavoidable consequence of melding quantum uncertainty with Einstein's notion that spacetime is stretchy and dynamic.

    In 2003, astrophysicists Richard Lieu and Lloyd Hillman of the University of Alabama, Huntsville, also concluded that the halo torpedoed the notion of quantum foam. Within the minuscule foam, concepts of distance and duration lose precise meaning. As a result, Lieu and Hillman argued in the Astrophysical Journal, quantum foam should create an uncertainty in how far the light from the quasar had to travel to reach Hubble. That uncertainty should blur the wave fronts and eliminate the ring.

    But Lieu and Hillman assumed that the uncertainty would grow in proportion to the distance to the quasar. Ng and colleagues pointed out in the Astrophysical Journal that if the foam varied randomly, then the uncertainty would increase in proportion to the square root of the distance, making the effect imperceptibly small.
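    A rough way to see the difference in scaling (the step-by-step picture here is an illustration, not taken from either paper): treat the path of length $L$ as $N \sim L/\ell$ independent segments, each contributing a length fluctuation of order $\ell$. If those fluctuations add coherently, the total uncertainty grows as $N\ell \sim L$; if they add like a random walk, it grows only as $\sqrt{N}\,\ell \sim \sqrt{L\ell}$, which for a microscopic $\ell$ over a 4-billion-light-year path is far too small to wash out the ring.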

    Now, Ng and colleagues have considered uncertainties not only in the distance the light travels but also in its direction, which changes as the light scatters off the “bubbles” in the foam. The scattering greatly increases the blurring effect, Ng says. So the presence of the ring rules out a randomly varying quantum foam after all. Less random versions could still exist, however, as fluctuations in the foam might conspire to reduce the blurring.

    “In my view, this is a compelling argument,” says Demos Kazanas, a theoretical astrophysicist at NASA's Goddard Space Flight Center in Greenbelt, Maryland. Giovanni Amelino-Camelia, a quantum gravity theorist at the University of Rome “La Sapienza” in Italy, says, “I'm starting to believe that I should invest in this” line of inquiry.

    As for Hillman, who died in 2005, and Lieu, “their suggestion was a good one, even if their argument was flawed,” Ng says. Researchers can test the idea by looking for halos in images of other quasars, and larger telescopes should be able to detect smaller effects of the foam and probe other theoretical models.

  8. U.S. INNOVATION

    Bandwagon Builds for Energy Research

    1. Eli Kintisch

    Influential Washington policymakers have decided that bolstering U.S. technical know-how and tackling energy challenges should go hand in hand. Their solutions are featured in a series of recent legislative proposals, including the bipartisan Protecting America's Competitive Edge (PACE) package, introduced in the Senate last week. The more-than-$70-billion package, like several other bills introduced in December, includes more money for researchers and science educators funded by the Department of Energy (DOE).

    Sunny and hot forecast.

    New funding proposals would boost energy research for areas such as photovoltaics and inherently safe nuclear power (shown here, a decommissioned plant in Herald, California).

    CREDIT: NREL

    The rapid economic development in India and China, a stagnant U.S. manufacturing base, and the poor performance by U.S. students on standardized tests in math and science have spurred a surfeit of recent legislative plans to tackle domestic competitiveness. Meanwhile, the rising demand for oil, tensions in the Middle East, and concerns about carbon emissions are pushing lawmakers to accelerate the development of new energy technologies. Both challenges were mentioned in a National Academies report released last fall (Science, 21 October 2005, p. 423). Previewing his State of the Union address earlier this week, President George W. Bush told Bob Schieffer of CBS News that an effort “to promote and actively advance new technologies” could make the U.S. “independent from foreign sources of oil.”

    That rhetoric signals the demise of an era in which “congressional support of science was built on the pillars of defense and health,” says former Massachusetts Institute of Technology president Charles Vest, who predicts that energy-environment, competitiveness-innovation, and health will be the new drivers of research funding.

    Some would like to recreate the excitement of the Apollo space program in the 1960s by picking a challenging technological target that could weld research with national priorities. Norman Augustine, former chair and CEO of Lockheed Martin, chaired the academies' panel, which considered a so-called National Energy Initiative. Likewise, lawmakers crafting the PACE act at one point toyed with targeting development in specific energy areas such as nuclear energy. But the “decision was to let that happen [naturally],” says PACE co-sponsor Senator Pete Domenici (R-NM).

    That approach is fine with Augustine. A focus on energy “happens to coincide with physics, engineering, and math,” he says. Both PACE and the academies report also call for a 10%-a-year boost in federal funding for basic research.

    PACE would give DOE an increased role in encouraging college students to major in science and engineering and improving training for science and math teachers at all levels through new scholarships. It also calls on DOE's national laboratories to support summer internship programs for gifted students. Insiders say Raymond Orbach, head of DOE's Office of Science and a former university president, helped persuade lawmakers to give DOE a larger national role in science education.

    One proposal in several of the bills is a new DOE research agency modeled on the Pentagon's Defense Advanced Research Projects Agency. Aimed at encouraging risky, high-payoff energy science, the new agency, dubbed ARPA-E, would recruit academic and industrial leaders for short periods to craft and manage innovative research initiatives. Nobelist Steve Chu, director of DOE's Lawrence Berkeley National Laboratory in California, says that such an agency would help “bridge the funding gap” that now exists between well-established yet risky science, such as fusion research, and basic work with hard-to-anticipate benefits, such as that in particle physics. ARPA-E is also part of a package of bills introduced in December by Representative Bart Gordon (D-TN), ranking Democrat on the House Science Committee, and a recent proposal by Senate Democrats. Although not mentioned by name, the approach is also endorsed in a December innovation bill introduced by Senators John Ensign (R-NV) and Joe Lieberman (D-CT).

    These legislative proposals may reflect a convergence of thinking in Congress. But supporters will also need to convince spending panels. Advocates don't see that as an insurmountable obstacle. PACE co-sponsor Senator Lamar Alexander (R-TN), for example, calls PACE's multibillion-dollar cost “a small price for a high standard of living.”

  9. SCIENTIFIC CONDUCT

    Panel Discredits Findings of Tokyo University Team

    1. Dennis Normile

    TOKYO—A University of Tokyo chemist has been stripped of his teaching duties and his graduate students following an investigation unprecedented in Japanese academia. Last week, university officials announced that a group led by Kazunari Taira has been unable to reproduce findings from four key papers. Taira maintains he has done nothing wrong aside from failing to ensure that experimental data were properly recorded. The headline-grabbing case is likely to spur other institutions to establish procedures for handling misconduct allegations.

    Case closed?

    A University of Tokyo panel has concluded that certain findings from chemist Kazunari Taira's team could not be substantiated.

    CREDIT: D. NORMILE

    An investigation began last spring after the RNA Society of Japan wrote to the university raising questions about 11 papers that appeared between 1998 and 2004 in Nature, Nature Biotechnology, the Proceedings of the National Academy of Sciences, and other journals. The society acted on reports from scientists in Japan and from overseas saying they could not replicate the group's results, sources say. Hiroaki Kawasaki, a research associate in Taira's lab, was first author on 10 of the 11 papers. Taira was corresponding author of nine papers; he and Kawasaki were co-authors of the other two.

    A panel led by Yoichiro Matsumoto, a mechanical engineer in the Graduate School of Engineering, was formed to probe the RNA Society's concerns. In an interim report released last September, the committee announced that a number of specialists contacted by the panel claimed they were unable to reproduce Taira's results. The committee then selected four papers for a closer look and found that the group could not produce raw data or notebooks to support the findings (Science, 23 September 2005, p. 1973).

    Taira insisted that he could repeat the experiments, so the committee asked him to do so. Kawasaki claimed to have replicated the findings in one of the papers, but the panel found that he had used materials different from those described in the original paper. Taira says more time is needed to work on the other experiments. However, at a 27 January media briefing, Matsumoto said bluntly, “At this time, there is no evidence the experiments can be repeated.”

    Junichi Hamada, a university vice president, said at the press briefing that both Taira and Kawasaki will now face a disciplinary committee and could be dismissed. In the meantime, the Graduate School of Engineering has relieved Taira of teaching duties and transferred his 25 graduate students to other teams. His own research has ground to a halt, and he says he will have to restart his career “from scratch.”

    “If I was just making up data, I wouldn't have had to work the 100 hours a week I was working,” says Taira, whose recent studies involve RNA. But he concedes that his group is having trouble reproducing some results. The investigation was the first ever by the University of Tokyo, widely considered to be Japan's most prestigious. The university is mulling the establishment of a permanent committee or office to address research misconduct, says panel chair Matsumoto.

    Observers say they are pleased with the outcome. “The University of Tokyo should be highly praised for its handling of this investigation,” says Norihiro Okada, a molecular biologist at the Tokyo Institute of Technology and one of the members of the RNA Society who urged an inquiry.

    Okada and others believe that the case has focused attention on the need for more policing of misconduct. RIKEN, the nation's premier collection of basic research institutes, is ahead of the game. Its auditing and compliance office, created last April, now has the authority to investigate any hints of misconduct. Each RIKEN group must now make experimental records available for inspection for 5 years after publication, and the contributions and responsibilities of every author must be made clear. Office director Fumikazu Kabe says the policy might have to be modified for adoption by universities, “but it probably is something they could use as an example.”

  10. NEUROSCIENCE

    A Timely Debate About the Brain

    1. Yudhijit Bhattacharjee

    Neuroscientists have recently shown that multiple brain regions are used to judge short intervals, but fierce disagreement continues over how neurons in those regions measure time

    Any driver will agree that a yellow light at a traffic intersection presents a conundrum. Should one hit the brakes to stop or keep going—speeding, if necessary, to beat the red light? A number of factors could influence the choice, including the degree of recklessness of the driver, the urgency of the trip, and, not least, whether a police car is in sight. But the key element in the decision is the person's estimate of how much time it will take for the signal to turn red.

    Time is integral to myriad other questions in everyday life: how long to grill one side of a burger before flipping it, how long to let a phone ring before hanging up, or how long to wait during a pause in conversation before treating it as a speaking cue. In both people and animals, the brain's ability to keep track of intervals is fundamental to innumerable behaviors. Some, such as walking and singing, rely on timing on the order of tens to hundreds of milliseconds. Others, such as foraging and making decisions, including the yellow-light problem, involve judgment of intervals on the scale of seconds to minutes and hours. As Warren Meck, a cognitive neuroscientist at Duke University in Durham, North Carolina, puts it: “Timing is everything.”

    For decades, researchers have sought to uncover the neural basis of time perception. They've been motivated in part by success at understanding the circadian clock: the biological timer that regulates the day-night cycle. In mammals, this 24-hour timepiece has a specific home: the brain's hypothalamus. Not surprisingly, scientists have hoped to discover a localized structure somewhere in the brain dedicated to tracking shorter time intervals. But now, timing researchers are all but abandoning the search for such an interval timer in any single region of the brain. Instead, they are increasingly convinced that the brain judges intervals on short time scales—milliseconds to minutes and hours—with the help of a distributed network of neurons. This shift is being driven by a slew of findings from electrophysiological studies on animals, behavioral experiments involving patients with brain lesions, and neuroimaging studies of healthy people.

    In addition to identifying the different brain regions that play a role in timing, these experiments are prompting scientists to reexamine the classic view of how neurons keep track of time. And even though that has not yet led to a mechanistic account that satisfies everybody, researchers say the effort is helping to take timing research beyond the speculative realm of psychology into the firmer territory of neuroscience. “We're finally getting some neural reality into the picture,” says Russell Church, a psychologist at Brown University, who has studied timing for more than 30 years.

    A distributed timekeeper

    Inspired by the hypothalamic circadian clock, researchers began looking for a short-time-scale clock in the brain in the 1970s. Some focused on the hippocampus, assuming that time perception was related to memory. Others searched the cerebellum. By the mid-1990s, many were convinced that the clock was located in the basal ganglia.

    Yet in recent years, neuroscientists have linked multiple areas throughout the cortex to time perception. Some evidence has come from neuronal recordings in animal brains. In 2003, for example, Michael Shadlen, a neuroscientist at the University of Washington, Seattle, and his graduate student Matthew Leon reported training monkeys to make eye movements based on duration judgments in the range of 0.3 to 1 second. The two found that neurons in the animals' posterior parietal cortex increased their firing rate based on how much time had elapsed. The results suggested that these neurons track the flow of time relative to a remembered duration. Other teams of researchers, including one led by Yoshio Sakurai of Kyoto University in Japan, and a group led by Carlos Brody at Cold Spring Harbor Laboratory in New York, have observed similar patterns of neuronal activity in the prefrontal cortex of monkeys performing timing tasks.

    Evidence for the involvement of different cortical areas in timing has also come from studies of patients with brain lesions. In 2002, a team led by Giacomo Koch, then at Italy's University of Rome, reported on a patient with a prefrontal cortical lesion who underestimated durations of a few seconds—time to him seemed to pass more quickly than it actually did. The same year, a group led by Marc Wittmann of Ludwig Maximilians University in Munich, Germany, described patients with lesions in other cortical areas who also underestimated durations longer than 3.5 seconds. Then in 2003, Koch and his colleagues showed that they could induce healthy subjects to underestimate multisecond intervals by suppressing their prefrontal cortices with a focused magnetic field.

    Some of the clearest evidence for a distributed picture of timing has come from neuroimaging studies. In one such study, researchers in France asked 12 subjects to compare the color and duration of two circles presented one after the other on a computer screen (Science, 5 March 2004, p. 1506). Each circle was colored one of three shades of purple and stayed on for one of three durations: 0.5, 1, or 1.6 seconds. In some trials, the subjects had to indicate if the second circle was bluer or redder than the first; in others they judged if the second circle appeared for a longer or shorter duration.

    Functional magnetic resonance imaging scans of the volunteers showed activation of an extensive network of brain areas during the time estimation task; in contrast, only the V4 area of the visual cortex lit up during the color-judgment task. Also, the various areas that lit up during the timing task—including the prefrontal and parietal cortices and the basal ganglia—showed increases in activity as the subjects paid more attention to time.

    “Although visual features such as color or motion or form can be linked to single-feature-specific processing areas, timing information appears to be coded in a distributed network of brain regions,” says Jennifer Coull, a cognitive neuroscientist at the Centre National de la Recherche Scientifique in Marseille, France, and lead author of the study. “Maybe we have to integrate several sources of information in order to estimate time because it is so much less tangible to our senses than visual features.”

    In the August issue of Human Brain Mapping, a different French group led by Viviane Pouthas of the Salpêtrière Hospital in Paris reported activation in a similar set of brain regions when subjects timed intervals that were about 0.5 and 1.5 seconds long. The researchers also observed that a subset of these regions—including certain areas of the cortex and the striatum—showed higher activity when subjects estimated the longer duration. Pouthas says this subset could be playing a direct role in time estimation, distinct from other components of the task such as recalling and comparing intervals.

    How it works: The old and the new

    Although most researchers are now convinced that timing involves multiple regions of the brain, they disagree on how neurons actually keep track of time. Until recently, the prevailing theory had been that some neurons release pulses of one or more neurotransmitters at periodic intervals while other neurons accumulate them, in the same way that a cup placed under a steadily dripping faucet accumulates drops of water. As the receiving neurons register more and more signals, the sense of time that has passed grows. Moreover, quantities of accumulated pulses corresponding to specific durations are recorded in long-term memory, allowing an individual to compare newly encountered time intervals to those previously experienced.

    Spread out.

    Multiple brain regions are activated in a time-estimation task (top); a few of these regions (bottom) show increased activation while estimating longer intervals.

    CREDIT: V. POUTHAS/HOPITAL DE LA SALPETRIERE

    This account of time perception—known as the pacemaker-accumulator model—has held sway since it was proposed in the 1970s by the late John Gibbon, a psychologist at Columbia University. Researchers have found the model to be a handy framework for explaining a fundamental feature of timing, seen in both animals and humans, called the scalar property—which is that the amount of error in estimating time intervals increases linearly with the duration being timed. The model has also provided psychologists with a good handle on a variety of other behavioral findings related to timing.
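    In symbols, the scalar property says that the spread of timed responses grows in proportion to the target duration, $\sigma(T) \approx kT$, so the relative error $\sigma(T)/T$ stays roughly constant whether one is timing seconds or minutes.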

    But now it is being challenged by some as too simplistic—and perhaps even fundamentally flawed. One challenger is Meck, a protégé of Gibbon and once a strong proponent of the pacemaker model. His group has recently put forth a new idea that has garnered support from many in the field but strong criticism from others.

    Meck spent the 1980s and the early 1990s seeking to identify the neural pieces of the pacemaker-accumulator model. Although this system could in theory operate in a specific brain region, it could also involve multiple regions, as might be expected by the more recently embraced idea of a distributed neural network. Working with Chara Malapani, a clinical psychiatrist at the New York Psychiatric Institute in New York City, and others, Meck proposed in the mid-1990s that the brain's stopwatch was located in the basal ganglia, comprising dopamine-secreting “pacemaker” neurons in the substantia nigra and “accumulator” neurons in the striatum. Some of the evidence for this hypothesis came from studies of Parkinson's disease patients, whose poor performance on timing tasks was found to be linked to the loss of dopamine-producing neurons. Researchers found that medicating these patients with L-DOPA, a drug that increases dopamine levels, improved their timing.

    Even though the dopamine work seemed to put flesh on the pacemaker theory, the model ran into trouble a few years later. At the time, Meck was already somewhat skeptical about the capacity of neurons to linearly sum up temporal pulses over the course of seconds to minutes. Then one of his doctoral students, Matthew Matell, marshaled evidence from the neurobiological literature that convinced Meck that dopamine could not drive neurons in the striatum to fire in the simplistic way proposed by the pacemaker model.

    Meck and Matell have developed an alternative model in which the striatum reads out intervals from a snapshot of activity across a network of cortical neurons. The different neural populations in the cortex—all connected to neurons in the striatum—have firing rates that oscillate at different frequencies. At any given point, the pattern of activity across the cortical network—the synchronous firing by a certain ensemble of neurons—represents the brain's temporal signal, a distributed code that Matell calls the “clock input.”

    When an event has to be timed, the cortical clock is reset through a synchronization of neuronal firing in the network. The neurons in the striatum track the evolving network's activity, until some kind of reinforcement arrives, marking the end of the timed interval: a pellet of food for a rat in a timing experiment, or the change of light for a driver at a traffic intersection. Each reinforcement triggers the substantia nigra to release a wave of dopamine onto striatal neurons, helping to establish a link between their firing and the pattern of activity in the network at that moment. After a number of experiences with a given duration, the striatal neurons start to recognize the snapshot of coincident activity at the moment of reinforcement and are driven to intensify their firing at that moment, indicating that the timed interval is up. This snapshot is filed away in memory for future reference, although Meck and Matell's model doesn't fully explain how this is done.
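    To make the coincidence-detection idea concrete, here is a minimal, hypothetical sketch in Python: a handful of “cortical” oscillators are phase-reset when timing starts, a detector stores the pattern of their activity at the moment of reinforcement, and on later trials it signals when that pattern recurs. The oscillator frequencies and the matching rule are illustrative choices, not parameters from Meck and Matell's model; only the 40-second interval echoes the rat experiment described below.

        import numpy as np

        # Toy sketch of coincidence detection across oscillators (illustrative only;
        # frequencies and the matching rule are arbitrary choices, not model parameters).
        freqs = np.array([1.11, 1.37, 1.93, 2.71, 3.33])   # "cortical" oscillation rates, Hz
        dt = 0.001
        t = np.arange(0.0, 60.0, dt)                        # 60 s of simulated time

        # All populations are phase-reset (synchronized) when timing starts at t = 0.
        activity = np.cos(2 * np.pi * np.outer(freqs, t))   # shape: (populations, time steps)

        # Reinforcement arrives at the trained interval; the detector stores the
        # pattern of activity across populations at that moment (the learned snapshot).
        trained_interval = 40.0                              # seconds, as in the rat experiment
        snapshot = activity[:, int(round(trained_interval / dt))]

        # On a later trial the detector responds most strongly where the current pattern
        # best matches the stored snapshot, marking the end of the interval.
        mismatch = np.sum((activity - snapshot[:, None]) ** 2, axis=0)
        print(f"detector response peaks near t = {t[np.argmin(mismatch)]:.1f} s")

    Run as written, the detector's mismatch bottoms out (its response peaks) at the stored 40-second mark; the real proposal adds many more neural populations, noise, and dopamine-gated learning of the snapshot.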

    The right moment.

    Warren Meck (top) and Matthew Matell (bottom) propose that striatal neurons recognize a learned interval by detecting the pattern of coincident activity across different neural populations. The input from populations that are active at that instant (1, 3, and 5) causes the striatal neurons to step up their firing rate, marking an interval's end.

    CREDITS (TOP TO BOTTOM): COURTESY OF W. MECK; COURTESY OF M. MATELL; K. BUCKHEIT/SCIENCE (INSET)

    “You could imagine the cortical network as a symphony playing a concerto, while the striatum acts as a listener,” says Matell. “When the musical piece comes to a point where the violins and the flutes and the tympani all cooccur—that is, when a particular ensemble pattern of neurons is simultaneously active—the listener decides that a particular duration has elapsed. One could take this analogy further and propose that attention or consciousness serves as the conductor of the symphony.”

    The new model, which Meck and Matell have named the striatal beat frequency (SBF) model, represents a dramatic shift from the pacemaker model because it fundamentally rejects the intuitive notion of time flowing like sand in an hourglass, collecting in a heap as the moments pass. Not only does the new model predict the scalar property of timing, say the two researchers, it is supported by neurophysiological findings. In experiments in which rats were trained to estimate a 40-second duration, Meck, Matell, and a colleague found that the firing rate of striatal neurons peaked at the end of the trained interval, as predicted by the SBF model. Meck and Matell say the recent studies pointing to a distributed picture of timing buttress their theory.

    The SBF model “elegantly allows for the timing of multiple intervals and higher-order temporal structure,” says Dean Buonomano, a cognitive neuroscientist at the University of California, Los Angeles. “But more importantly, it eliminates the need of an accumulator: Counting is not a computation that comes as naturally to neurons as does coincidence detection.”

    Not everybody has been as kind. “Pure fantasy” is how Shadlen describes the model. “Synchronous spikes are ubiquitous in the cortex; there are thousands of neurons firing simultaneously at any given time,” he says, arguing that it's unrealistic to think that such widespread patterns might serve as distinct temporal codes. Shadlen contends that it's more likely that “time is wrapped up inextricably with expectation” and is represented as part of an anticipatory buildup signal (in line with the pacemaker idea) in each of the cortical areas that might be involved in carrying out a task. As evidence, Shadlen points to work that he and a colleague reported in Nature Neuroscience last year in which they recorded such ramplike activity from neurons in the posterior parietal cortices of monkeys that performed eye movements based on time estimates.

    The precise pattern of coincident activity in the cortex changes over time and could thus serve as a duration code, responds Matell. “Using the symphony analogy, there are many musicians playing pretty much throughout a piece of music. And yet, the piece of music is distinguishable at one point in the piece from another point in the piece.” Matell adds that the neurons recorded in Shadlen's monkey experiments could be firing in response to a temporal readout provided by the striatum rather than representing time themselves.

    John Wearden, a psychologist at Keele University in the U.K., levels another charge against the SBF model and similar ideas sometimes grouped as nonclock models. Because different times in the SBF model are represented as different patterns of neuronal activity, he says, there's no way to tell if one interval is longer than another, even though that's something people naturally judge all the time.

    The SBF model permits such comparisons, counters Matell: “If you provided me with two durations, I could time the first duration—let's say that's the longer one—learn its cortical snapshot, and then evaluate whether the second cue finished before my ‘time's up’ for the first or vice versa,” he says.

    Trying to address some of the field's skepticism, Meck and Matell are looking to refine and test their model by recording neurons from multiple sites in rat brains, and by electrophysiological and neuroimaging studies of Parkinson's disease patients. Regardless of whether the SBF model survives, they and many others feel there's little chance of returning to the classic view of a discrete hourglass in a single brain region. As Buonomano puts it, “The notion that timing relies on a centralized single pacemaker-accumulator is on its deathbed.”

  11. ARATA KOCHI PROFILE

    Fighting Words From WHO's New Malaria Chief

    1. John Bohannon*
    * John Bohannon is a writer in Berlin, Germany.

    Just months into his new job, Arata Kochi is battling big pharma on drug resistance

    A new heavyweight champion has stepped into the ring to fight the global scourge of malaria. Less than 3 months after taking office as director of the World Health Organization's (WHO's) malaria program, Arata Kochi wants the world to know that he's ready to rumble. At a 19 January press conference at WHO headquarters in Geneva, Switzerland, Kochi issued a 3-month ultimatum to the global pharmaceutical industry to stop selling the single-dose form of the drug artemisinin because of the danger of creating resistant strains of the malaria-causing parasite. Kochi threatened to name and shame 18 offending drug companies and said his next step would be to lobby for a “complete ban” of those companies' other malaria medications. “The quiet approach will never work,” Kochi told Science.

    CREDIT: KEVIN WOLF/ASSOCIATED PRESS

    The announcement is a departure for WHO, an international organization that usually relies on consensus before taking action. “We have often been criticized for being slow and ineffective,” says Pascal Ringwald, a medical officer in WHO's Roll Back Malaria program. “But if resistance [to artemisinin] appears tomorrow, the WHO cannot be blamed for saying nothing.”

    First extracted from the sweet wormwood shrub by Chinese scientists in 1972, artemisinin is the most effective drug today against malaria, with a single dose curing 90% of cases within days. Because resistance to the other malaria drugs is on the rise everywhere, artemisinin is seen as the last defense against a disease that kills 1 million people each year, most of them African children. Initially, scientists thought it unlikely that the parasite could develop resistance to artemisinin because of its mode of action—a peroxide group that releases destructive oxygen atoms. But both the exact mode of action and the possibility of resistance are still in doubt, and experts are alarmed at the recent discovery of a mutation in the parasite that reduces its sensitivity to the drug (Science, 9 December 2005, p. 1607).

    Although no one has yet died of artemisinin-resistant malaria, says Ringwald, “the warning signs are all there.” To prevent resistance, scientists and WHO officials have been urging governments for the past several years to use artemisinin only in cocktails of multiple drugs called artemisinin-based combination therapy (ACT). “If we lose artemisinin, we lose ACT, and it could be 10 years before a new drug is available, which would be a catastrophe,” says Ringwald.

    Kochi may be a newcomer to the malaria scene, but he's no neophyte to global health. A Japanese public health physician trained at Harvard University, he directed WHO's tuberculosis (TB) programs for 10 years, turning a fledgling two-member staff into one of its flagship programs. “Kochi had a vision” for how to combat TB, says Nils Billo, director of the Paris-based International Union Against Tuberculosis and Lung Disease, “which now most of the countries of the world have adopted and implemented.” Despite his efforts, however, TB remains a major threat—an appeal for a fresh attack on TB was launched last week in Davos, Switzerland.

    With Kochi now focused on malaria, his bold opening move is yielding mixed reviews. “The need to switch from monotherapy to ACT was recognized years ago,” says Brian Greenwood, director of the London School of Hygiene and Tropical Medicine. “But antagonizing big pharma is not a sensible strategy.” Greenwood argues that there is little money to be made developing and selling drugs for a disease that is nearly exclusive to the developing world and that “these companies are really only doing this for good public relations. We need their help if we're going to get medicines into poor communities.” An official at one of the biggest companies on Kochi's list, Paris-based Sanofi-Aventis, told Science that the company plans to comply but added: “It is the responsibility of local authorities to implement the switch to ACTs, which is more complex and requires education.”

    Others involved with global health praise Kochi for “taking a stand and saying something that we've all been thinking,” says Chris Hentschel, CEO of the Geneva-based Medicines for Malaria Venture. “We're behind him.” Hentschel says that not all companies are making malaria medicines “just for charity” and that in some cases “they have been unhelpful.”

    Right or wrong, Kochi faces an uphill battle. “WHO has no powers to enforce and a very small budget,” says Hentschel, “so the most it can do is damage a company's reputation.” He predicts that smaller companies may ignore the ultimatum “because they feel they don't have any reputation to lose.” One way WHO could make an impact, he says, would be to influence the decisions of big drug purchasers such as the U.N.'s children's fund UNICEF or the Global Alliance for Vaccines and Immunization.

    For his part, Kochi seems confident. “When I named countries that weren't doing enough to fight TB in 1996, they responded and improved,” he says. As proof that his strategy is sound, he notes that two companies on his list of 18—Switzerland-based Mepha and “a generic drug company”—have already promised to comply.

    Kochi says his next targets are “gaps” in malaria research. “Malaria epidemiology is very weak,” he says, “and we also need more consensus on how to diagnose the disease.” Without “better science,” Kochi says, strategies to combat malaria “will continue to be like religion, based on faith.”

  12. BIOMEDICAL ENGINEERING

    Spending Itself Out of Existence, Whitaker Brings a Field to Life

    1. David Grimm

    The Whitaker Foundation took on the job of turning a fledgling field into a scientific heavyweight—and succeeded. But what happens to biomedical engineering now?

    Bioengineer Sangeeta Bhatia and the Whitaker Foundation are scientific soul mates. As a graduate student in a joint M.D.-Ph.D. program at the Massachusetts Institute of Technology (MIT) and Harvard University in the 1990s, Bhatia attended the Whitaker College of Health Sciences and Technology, an interdisciplinary program for scientists and engineers in the Boston area that the foundation began funding in 1979. After graduation, Bhatia received a Whitaker young investigator's grant to set up her lab at the University of California, San Diego (UCSD), a school that has received $23 million from the foundation to build up its biomedical engineering department.

    “The Whitaker award allowed me to get my very first piece of equipment and hire a graduate student,” says Bhatia, now an associate professor at MIT. “It helped me launch my entire research program.”

    There and back again.

    Thanks to Whitaker funding, Sangeeta Bhatia has returned to MIT as an associate professor.

    CREDIT: MARIO CASAL/HST

    With grants from the National Institutes of Health (NIH) and the National Science Foundation (NSF), Bhatia appears well along the road to a successful academic career. And that's fortunate, as Whitaker is on the road to oblivion. In June, the foundation will shut its doors after 30 years and more than $800 million in scientific philanthropy. During its lifetime, the Arlington, Virginia-based foundation has invested in thousands of young faculty members and graduate students and built hundreds of laboratories. That investment has transformed biomedical engineering from a barely recognized discipline into one of the most popular science majors in the United States.

    “Their impact is almost immeasurable,” says Frank Yin, president of the Biomedical Engineering Society and chair of the biomedical engineering department at Washington University in St. Louis, Missouri. “Whitaker put biomedical engineering on the map.”

    As remarkable as its largess is the way Whitaker spent it. “Most foundations focus on a problem—such as world hunger—and anyone who has a tool to address this problem qualifies for funding,” says Thomas Skalak, chair of the biomedical engineering department at the University of Virginia (UVA) in Charlottesville. “Whitaker was unique in that it tried to establish the permanence of a particular field. And it knew that to do this, it would need to build up the field's infrastructure.” That thrust benefited young and promising scientists such as Bhatia, whom the foundation regarded as the future of the discipline.

    But will biomedical engineering continue to thrive when Whitaker leaves the scene? Some fear programs that have just begun to blossom under Whitaker's care may wilt. Others are concerned that some young faculty members could be orphaned without Whitaker's support, stunting the entire field. And still others worry that the foundation may have overbuilt the field's academic structure, creating more departments than the discipline can maintain. “The changes over the next few years could be pretty dramatic,” Yin predicts.

    Going for broke

    The Whitaker Foundation was never supposed to last forever. Its founder, U. A. Whitaker—an engineer and CEO of a company that manufactured electronic connectors—hoped the foundation would fold within 40 years of his death in 1975. “He hated bureaucracy,” says Whitaker President Peter Katona. “He felt that a foundation wouldn't accomplish its mission if it went after that mission forever.”

    Despite those concerns, Whitaker operated much like any other charitable organization in its early years. It spent about 10% of its capital annually (about $14 million) fostering collaborations between biologists and engineers to develop medical devices. Most of the money was channeled into 3-to-4-year grants to new faculty members such as Bhatia.

    Like most biomedical engineers, Bhatia avoided a lengthy postdoc and moved directly into an assistant professorship. But there was a tradeoff: The quick transition prevented her from gathering enough research data to feel comfortable submitting a grant proposal to NIH. Another handicap was the project itself: designing a cartridge filled with liver cells that would help filter blood in patients with liver failure. Its large engineering component, she felt, meant it “would never fly at the NIH.” But once Bhatia had received the Whitaker award, she had the wherewithal to pitch a successful application first to NIH and later to NSF.

    Although Whitaker's board was happy with the return on modest investments such as this, in 1991 it decided to go for broke. “The governing board wanted to increase the impact of the foundation when the field was at the cusp of becoming mainstream,” says Katona. “They knew the only way to do this was to spend big bucks.”

    Ironically, exhausting the $200 million endowment proved harder than expected. Over the next 4 years, the foundation's assets more than doubled, thanks to the stock market boom. By the end of the decade, its annual payout of more than $60 million matched that of charities with an endowment of $1.2 billion.
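    (That comparison presumably rests on the standard payout rate of roughly 5% of assets that U.S. private foundations must distribute each year: $60 million ÷ 0.05 = $1.2 billion.)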

    One goal was to create thriving bioengineering departments at top U.S. universities. That's what happened at UCSD, whose program in 1988 consisted of six faculty members on half a floor of a medical school building. In 1993, Whitaker awarded the university $5 million to hire more professors and develop core facilities. Four years later, the foundation gave UCSD $18 million for a bioengineering building.

    “The building brought together faculty and staff for the first time and really transformed the university into a national powerhouse for bioengineering,” says Bhatia, who was hired under the first award. Today, UCSD has an official bioengineering department with 18 faculty members, 150 graduate students, and 1000 undergraduate majors—triple the pre-Whitaker numbers. U.S. News & World Report ranks the department second in the nation; before Whitaker, it wasn't even on the radar.

    Dozens of schools can tell similar stories. Since 1991, the number of biomedical engineering departments in the United States has soared from 27 to 74, with accompanying increases in the number of undergraduates and graduate students.

    Whitaker's smaller awards have made a big difference as well, says Skalak. At UVA, two $1 million awards allowed his department to create new biomedical engineering courses and establish dedicated lecture halls. Bhatia used similar awards at UCSD to write the first undergraduate textbook on tissue engineering and to help develop a Web site that allows all UC students to take biomedical engineering classes online. “Whitaker realized that these tools would help grow biomedical engineering as a discipline and were vital to the field's future,” she says.

    Engineering change

    With Whitaker folding its tent, biomedical engineers are wondering if the field can continue to thrive. “There's a lot of trepidation in the field about what will happen now,” says Bhatia. Thanks to the foundation's emphasis on infrastructure, the number of biomedical engineers continues to increase, even as funding remains static. But the field will also have to cope with the loss of Whitaker's start-up grants for new biomedical engineering faculty members, a $275 million program that over the past 15 years has made 80 to 100 awards annually. “The loss of these awards is going to make it difficult for new professors,” admits Katona, who says he would much rather have been entering the field 3 or 4 years ago than today.

    The Miami, Florida-based Wallace H. Coulter Foundation offers early career awards in bioengineering, but they provide less money and last only 2 years. A better candidate to fill Whitaker's shoes is the 5-year-old, $300-million-a-year National Institute of Biomedical Imaging and Bioengineering (NIBIB). Aware of the needs of new investigators, NIBIB gives their grant applications a 5% bonus in merit reviews. “The idea is to cut new professors extra slack so they'll have an easier time getting funded,” says Deputy Director Belinda Seto.

    Bioengineering boom.

    The number of U.S. biomedical engineering departments has skyrocketed thanks to Whitaker's sustained investment.

    Big winners. With $18 million for its bioengineering building (inset), the University of California, San Diego, leads the list of Whitaker institutional award winners.

    CREDITS: UNIVERSITY OF CALIFORNIA, SAN DIEGO (INSET); SOURCE: WHITAKER FOUNDATION

    Thanks to the policy, Seto says seven additional biomedical engineering faculty members qualified for basic R01 grants last year, bringing the total to 24. In addition, she says young investigators can apply for exploratory R21 grants, which don't require preliminary research data and confer an average of $350,000 for 2 years. In 2005, 39 of these awards went to new faculty members.

    Robert Nerem, director of the Parker H. Petit Institute for Bioengineering and Bioscience at the Georgia Institute of Technology in Atlanta, worries that such policies won't be enough. The R21s are too short, he says, and both he and Yin say that most of NIBIB's funding goes toward clinical imaging research rather than biomedical engineering. In addition, Nerem notes that NIBIB has one of the lowest applicant success rates at NIH. “So far, NIBIB has not stepped up to the plate,” he says.

    Another concern is that Whitaker may have overbuilt the field's academic structure, says Nerem. “Was building 60, 70, 80 departments really the right strategy?” he asks. The number of biomedical engineers graduating from many smaller schools may contribute to an oversupply in the short term, he says. That could eventually lead to a Darwinian crunch that hurts the smaller departments.

    As the chair of one such department, Michael Neuman of Michigan Technological University in Houghton admits that “Whitaker took a bit of a gamble with us” because of the school's isolated location on Michigan's Upper Peninsula, the lack of a nearby medical school, and its small life sciences program. To survive, Neuman says that programs such as his may have to change their focus from research to teaching, qualifying them for a larger university allocation based on class enrollments.

    Yin expects other changes as well. As fewer faculty positions open up, graduates may find themselves doing longer postdocs, and more biomedical engineers may begin moving into new areas. Neuman has begun to see some of this already at Michigan Tech. A recent graduate joined the automotive industry and is studying the biomechanics of car accidents, he says. And a former student of Neuman's is a child abuse counselor who uses her education to assess, for example, whether a child really got his injuries from falling down a flight of stairs.

    Although Whitaker may not have foreseen these changes, Katona, the foundation's president, is happy with the community's response. “I like the idea that some universities will do things differently and that not everyone is taking the same career path,” he says. “We've done our job. … Now it's up to the field to have a midcourse correction if necessary.”

    Despite the challenges, Bhatia is optimistic that the interdisciplinary approach that permeates biomedical engineering and the growing demand for new medical technology will help sustain the field. Her new project uses nanoparticles to target drugs to tumors—the precise mixture of biology and engineering that Whitaker has tried so hard to foster. “Whitaker got us to this point by taking a risk,” says Bhatia. “Now we must evolve without them.”

  13. AMERICAN ASTRONOMICAL SOCIETY MEETING

    Laser Points to Bright New Era for Ground-Based Astronomy

    1. Robert Irion

    AMERICAN ASTRONOMICAL SOCIETY MEETING, 8–12 JANUARY, WASHINGTON, D.C.

    Many dark nights at the W. M. Keck Observatory atop Mauna Kea, Hawaii, now feature a startling source of light: a laser beam emerging from one of the twin domes. Rather than swamping faint signals from the heavens, however, the photons have quite the opposite effect.

    Starring role.

    A new laser system sharpens the Keck II Telescope's view of faint objects.

    CREDIT: SARAH ANDERSON/W. M. KECK OBSERVATORY

    The long-awaited advance, called laser guide star adaptive optics, trumps the standard adaptive optics now used at most major telescopes. Light from a star jitters rapidly as it passes through Earth's shifting atmosphere. In the current systems, computers analyze that pattern, then flex a thin mirror within the telescope's optical path to correct the distortion. The result is steady vision rivaling that of the Hubble Space Telescope. But this technology requires light from a bright star or planet, limiting its application to about 1% of the sky.

    In contrast, astronomers can aim the Keck laser nearly anywhere. The beam illuminates a layer of sodium atoms about 90 kilometers high, left there by incoming meteoroids. The flexible mirror sharpens the telescope's vision of this “star”—and, along with it, faint objects in the adjacent field of view. After years of tinkering, this system became available last year for routine scientific use on the 10-meter Keck II Telescope.

    The first results delighted a packed session of observers at the meeting. “It's very clear that it has moved beyond an experimental development to mature science observations,” says R. Mark Wagner, an instrument scientist for the Large Binocular Telescope at Mount Graham, Arizona. “They've made laser adaptive optics into a turnkey system,” adds Jeremy Mould, director of the National Optical Astronomy Observatory in Tucson, Arizona. “You just use it to do astronomy, and you don't worry about whether it's going to work.”

    For example, astronomer Michael Liu of the University of Hawaii, Manoa, used laser-honed vision to study close pairs of the failed stars called brown dwarfs. Liu's team resolved one binary pair with distinct reddish and bluish colors, even though the dwarfs probably formed together in the same gaseous nursery. Liu believes the difference arises from the ebb and flow of iron-rich clouds in their gradually cooling atmospheres. “Binaries like this will be our Rosetta stones for understanding how clouds vanish at this critical transition,” comments modeler Mark Marley of NASA's Ames Research Center in Mountain View, California.

    In another laser-guided project, astronomer Judith Cohen of the California Institute of Technology in Pasadena short-circuited a dispute about tight clusters of stars orbiting the nearby Andromeda galaxy. Some researchers had claimed that a few of Andromeda's stellar clusters are as young as 1 billion years, even though all such clusters in our Milky Way are at least 9 billion years old. But with Keck's newly piercing view, Cohen found that most of the supposed clusters were much looser groupings of stars or even flaws in previous images.

    Other targets have included satellites of bodies in our solar system's remote Kuiper belt; stars on swooping orbits around the black hole at the center of our galaxy; and the physical properties of galaxies where supernovae exploded more than 8 billion years ago. “These are the big leaps we were looking for,” says Keck Director Frederic Chaffee.

    Chaffee says Keck went to school on a prototype laser system at the University of California's Lick Observatory near San Jose. Several 8-meter telescopes are now catching up with planned laser-guided science runs later this year, including the U.S.-led Gemini North and Japan's Subaru Telescope at Mauna Kea, and the European Southern Observatory's Very Large Telescope at Cerro Paranal, Chile.

  14. AMERICAN ASTRONOMICAL SOCIETY MEETING

    Pulsar Sets a Dizzying Standard

    1. Robert Irion

    AMERICAN ASTRONOMICAL SOCIETY MEETING, 8–12 JANUARY, WASHINGTON, D.C.

    Astronomers have broken a long-standing record by finding the fastest spinning object in space. The new champ is a neutron star—the ultradense remnant of a supernova explosion—rotating 716 times each second. If the star is about 20 kilometers wide, as assumed, its equator would whirl at 15% the speed of light.
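    The quoted speed follows from simple geometry. Assuming the 20-kilometer width refers to the star's diameter (so a 10-kilometer radius), a point on the equator covers one circumference per rotation:

    v_{eq} = 2\pi R f \approx 2\pi \,(10\ \mathrm{km})\,(716\ \mathrm{s^{-1}}) \approx 4.5\times10^{4}\ \mathrm{km\ s^{-1}} \approx 0.15\,c.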

    The object is among dozens of newfound millisecond pulsars, so named for their clocklike pulses of radiation at hundreds of hertz (cycles per second). Astronomers led by Scott Ransom of the National Radio Astronomy Observatory (NRAO) in Charlottesville, Virginia, detected the pulsars in several of our galaxy's rich clusters of stars. Old pulsars in these tightly packed swarms get resuscitated when they interact with other stars and acquire binary partners. When such a companion star evolves into a bloated giant, intense gravity pulls its gas toward the pulsar. The infalling matter spirals onto the pulsar and whips up its spin. The previous standard-bearer of 642 hertz was the first millisecond pulsar found, in 1982. The long wait made some theorists doubt that the spin-up process could go much faster.

    The new find explains part of the delay: The speediest millisecond pulsars appear screened by matter blasted off their companions, making them tough to spot. Graduate student Jason Hessels of McGill University in Montreal, Canada, and others worked with Ransom to tease out the pulsar's 716-hertz flashes, which vanish 40% of the time. An ongoing search of star clusters with a sensitive radio processor at NRAO's 100-meter Green Bank Telescope in West Virginia should unveil ever-faster spinners, Ransom says. The group reported its discovery at the meeting and in the 12 January online edition of Science (www.sciencemag.org/cgi/content/abstract/1123430).

    In principle, pulsars could accelerate to 1500 to 2000 hertz before shattering from centrifugal force, says astrophysicist Deepto Chakrabarty of the Massachusetts Institute of Technology in Cambridge. “But it's quite striking that they are nowhere near that,” he says. The new pulsar may have surpassed 1000 hertz at its peak. But Chakrabarty believes some physical mechanism—perhaps shedding of energy by gravitational waves—applied the brakes to the pulsar and its cousins soon after their rebirth.
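    The shattering threshold can be roughed out by asking when centrifugal acceleration at the equator overtakes gravity. For a textbook neutron star of 1.4 solar masses and 10-kilometer radius—assumed values for illustration, not figures from the report—a Newtonian estimate gives

    f_{max} \approx \frac{1}{2\pi}\sqrt{\frac{GM}{R^{3}}} \approx \frac{1}{2\pi}\sqrt{\frac{(6.67\times10^{-11})(2.8\times10^{30}\ \mathrm{kg})}{(10^{4}\ \mathrm{m})^{3}}} \approx 2200\ \mathrm{Hz},

    somewhat above the 1500-to-2000-hertz range Chakrabarty cites; relativistic corrections and realistic equations of state push the limit lower.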

  15. AMERICAN ASTRONOMICAL SOCIETY MEETING

    Pesky Companions Warp the Milky Way

    1. Robert Irion

    AMERICAN ASTRONOMICAL SOCIETY MEETING, 8–12 JANUARY, WASHINGTON, D.C.

    Our galaxy wins no prize for symmetry. Its disk of gas and stars bends upward and downward, like the brim of a trampled hat. Astronomers have long suspected that dwarf galaxies orbiting the Milky Way perturb the disk, especially the Magellanic Clouds—two smudges of stars visible from the Southern Hemisphere. At the meeting, researchers laid out fresh details of how that warping might occur. “It's a careful explanation for what's going on in the outer galaxy,” comments astrophysicist David Spergel of Princeton University in New Jersey.

    Warped.

    The Magellanic Clouds, both large and small (inset), bend our galaxy's disk of gas and stars.

    CREDIT: C. SMITH/S. POINTS/MCELS TEAM/NOAO/AURA/NSF; F. WINKLER, MIDDLEBURY COLLEGE/MCELS TEAM/NOAO/AURA/NSF (INSET)

    Important clues came from detailed radio images of neutral hydrogen gas in the Milky Way's disk, assembled by scientists in the Netherlands, Argentina, and Germany. Astronomer Leo Blitz of the University of California, Berkeley, and colleagues used the data to chart the asymmetric ebbs and flows of hydrogen above and below the disk. The team found that it could describe the warped shape as a mathematical superposition of three simple modes of vibration. Effectively, the Milky Way behaves like a vast cymbal anchored at its center.
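    The talk summary does not spell out the decomposition, but such a warp is conventionally written as a sum of low-order Fourier terms in the disk's vertical displacement (the form below is illustrative, not taken from the team's analysis):

    z(R,\phi) \approx \sum_{m=0}^{2} A_{m}(R)\,\cos\!\bigl(m\phi - \phi_{m}(R)\bigr),

    where the m = 0 term is a bowl-like offset, m = 1 the familiar integral-sign warp, and m = 2 a saddle-shaped bend.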

    Then, Blitz and dynamicist Martin Weinberg of the University of Massachusetts, Amherst, constructed a model of how the Magellanic Clouds might excite those vibrational modes. Although the two galaxies contain perhaps 2% of the Milky Way's mass, they exert an outsized influence thanks to one factor: dark matter. The dwarfs loop around our galaxy on lazy orbits lasting 1.5 billion years, slogging through the Milky Way's extended halo of hidden dark matter. Those motions raise persistent gravitational “wakes” that tug on the disk, Blitz says. The forces resonate strongly around the disk's edges, where gas is most loosely bound.

    When Weinberg and Blitz ran their model, they were surprised to see the disk's outermost portions constantly flapping as the Magellanic Clouds trundled along. That impressed astronomer Linda Sparke of the University of Wisconsin, Madison, who notes that at least half of all galaxies have distorted disks. “This will help convince people that there are ways to get warps to live a long time,” Sparke says. Thorough studies of the Milky Way's warp should let astronomers trace the extent and impacts of the galaxy's shroud of dark matter, she adds.

    Astronomer James Binney of Oxford University in the U.K. agrees that “junk” falling onto our galaxy creates lumpy irregularities in its halo and makes the disk quiver. But he thinks it's hasty to finger the Magellanic Clouds exclusively. In particular, a smaller and closer galaxy called the Sagittarius dwarf is now plunging through the Milky Way's disk, so its punch may contribute as well. “Both the dynamics and the astronomy are a bit of a mess,” Binney says. “I don't think this story will close off any time soon.”

  16. AMERICAN ASTRONOMICAL SOCIETY MEETING

    Snapshots From the Meeting

    1. Robert Irion

    AMERICAN ASTRONOMICAL SOCIETY MEETING, 8–12 JANUARY, WASHINGTON, D.C.

    Bulging waist. Vega, a bright star in the northern sky, barely holds itself together. The combined light from six widely spaced 1-meter telescopes at Mount Wilson, California, resolved fine details on Vega's surface. Interferometry patterns showed that gas at Vega's equator is a whopping 2300 kelvin cooler than at its pole, a difference caused by the star's grossly distorted shape as it rotates once every 12.5 hours. Modeling led by astronomer Jason Aufdenberg of the National Optical Astronomy Observatory in Tucson, Arizona, suggests that Vega would break apart if it spun only 9% faster.

    CREDIT: NASA/ESA/A. SCHALLER (FOR STSCI)

    Doomed giants. Infrared telescopes have exposed the heftiest group of the biggest stars. Of about 200 known red supergiants in our Milky Way, 14 reside in a tight cluster previously hidden behind dust toward the galaxy's center (artist's conception, above). Each unstable supergiant is roughly 1000 times as large as our sun. On average, one star should explode in a supernova every 20,000 to 60,000 years, says astronomer Donald Figer of the Rochester Institute of Technology in New York. Recent blasts in the cluster explain a distinct buzz of gamma rays and radio waves from that part of the sky.

    Nearly perfect. Those who enjoy geometric beauty in nature will gravitate to PSR J1909-3744, a rapidly spinning pulsar 3700 light-years away. For nearly 2 years, astronomer Bryan Jacoby of the Naval Research Laboratory in Washington, D.C., and colleagues clocked the arrival times of about 19 billion of the pulsar's blips. The accurate timing revealed that the pulsar's orbit, around a tiny companion star, tracks the most circular path yet seen in space. The orbit spans more than 1 million kilometers, but its major axis is just 11 micrometers wider than its minor axis.
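    That 11-micrometer difference implies an almost vanishing eccentricity. Taking the quoted span of roughly a million kilometers as the orbit's major axis (an assumption; the exact figure isn't given here) and using the small-eccentricity relation (major − minor) ≈ (major/2) e², the orbit's eccentricity works out to

    e \approx \sqrt{\frac{2\,(11\ \mu\mathrm{m})}{10^{6}\ \mathrm{km}}} \approx 1.5\times10^{-7}.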
