News this Week

Science  16 Jan 2009:
Vol. 323, Issue 5912, pp. 318

    Science Groups Emphasize Jobs in Push for More Research Funding

    1. Jeffrey Mervis
    Stimulating ideas.

    Representative Rosa DeLauro with Norman Augustine at last week's Democratic forum; behind them, House Speaker Nancy Pelosi confers with Reps. Bart Gordon and Rush Holt.


    As the United States struggles with its worst economic crisis since the Great Depression, lobbyists for basic research are offering one more reason the federal government should boost its investment in science: Science means jobs.

    Specifically, they want Congress to spend billions on a long list of existing research projects and programs at several federal agencies as part of a massive economic recovery plan legislators are cobbling together this month. To their surprise, that message has received a warm reception from President-elect Barack Obama, his aides, and the Democratic congressional leaders who are shaping a plan that could cost more than $800 billion over 2 years.

    The American Physical Society (APS), for example, is circulating a $3.5 billion wish list that covers research and training efforts funded by the U.S. Department of Energy (DOE), the National Science Foundation (NSF), and the National Institute of Standards and Technology (NIST). The list proposes upgrades and new activities at DOE's national laboratories, investments in a range of renewable energy technologies, and bump-ups in competitive grants programs at all three agencies. The society's initial list, drawn up right after the November elections, totaled $1.5 billion; it was revised after Obama transition team officials “told us we should think bigger,” says APS's Michael Lubell.

    Some of the projects are also listed in a $2.7 billion proposal from the Association of American Universities, which includes $1.8 billion for NSF and the U.S. National Institutes of Health (NIH) to help scientists setting out to become independent investigators. A similar compilation of projects from NASULGC, an association of public and land-grant universities, also calls for an additional $1.2 billion in NIH's bread-and-butter investigator grants and $200 million for competitive research at the U.S. Department of Agriculture.

    Maria Zuber, a geophysicist at the Massachusetts Institute of Technology (MIT) in Cambridge, says House Speaker Nancy Pelosi (D-CA) delivered the same message to a panel of academic and industry leaders who met with legislators at a December roundtable in Princeton, New Jersey, to discuss the role of science in reviving the economy. “In my wildest dreams, I thought that the best we could hope for would be full funding of the America COMPETES Act [2007 legislation that authorizes a 7-year budget doubling for NSF, DOE's Office of Science, and NIST],” says Zuber, a member of the panel, which was arranged by Representative Rush Holt (D-NJ), a former physicist. “But when we got there, the speaker told us, ‘The COMPETES Act is good. But ask for all of it now, and think bigger.’”

    Zuber, who reprised her role last week at a House Democratic leadership forum on Capitol Hill that Pelosi convened, doesn't expect everything on the societies' lists to be incorporated into bills being prepared separately by the House and Senate. But she also doesn't think lawmakers will turn a cold shoulder to the requests. One proposal before Senate appropriators would give DOE's Office of Science as much as $500 million in construction funds, to be spent at the agency's discretion.

    Advocates say the new focus on the literal nuts and bolts of modern science will not only put people back to work quickly but also make it less likely those jobs will disappear in a few years by keeping U.S. companies competitive in a global economy. With unemployment topping 7%, the highest rate in 16 years, that's an easy sell to politicians. In an 8 January speech billed as his first major address since his election, for example, Obama said that repairing the nation's tattered economy will require “investing in the science, research, and technology that will lead to new medical breakthroughs, new discoveries, and entire new industries” in addition to the traditional emphasis on “shovel-ready” construction projects. Extending his earlier commitment to renovate and connect up the nation's elementary and secondary schools, he also proposed “equipping tens of thousands of schools, community colleges, and public universities with 21st century classrooms, labs, and libraries.”

    That latter promise is music to the ears of academic scientists and administrators. Zuber, head of MIT's department of earth, atmospheric, and planetary sciences, says a new junior faculty member receives a $1 million start-up package from the university for buying equipment, renovating existing space, and doing whatever else it takes to launch an experimental program. After that, the fledgling scientist generally must fly solo.

    Even if the faculty member wins a federal research grant, chances are it won't include money for lab upgrades. Most federal agencies award less than scientists request, and most scientists choose people over instrumentation when deciding what to eliminate. “NASA always says that things will be better in the out years, but it never happens,” says Zuber, who is principal investigator for GRAIL, an upcoming NASA mission to measure the moon's gravitational field. She says that doubling NSF's major research instrumentation program, which now receives triple the number of good proposals that its $93 million budget can accommodate, “would be a really helpful” way to ease that crunch.

    Pelosi said as much at last week's forum, when she noted Zuber's presence on a panel along with three eminent economists and Norman Augustine, the former CEO of aerospace giant Lockheed Martin, who chaired a 2005 National Academies' report calling for a $20 billion federal investment in research and education to strengthen U.S. innovation. “Having a scientist here emphasizes that [the recovery plan] must be about the future,” Pelosi stated. “That's nothing new. … It's about innovation, which begins in the classroom and in the lab.”


    Current System Hampers U.S. Innovation, Says Panel

    1. Eli Kintisch

    In a world in which economic and national security hinge on a country's technological and scientific prowess, the United States gains more than it loses from sharing information. That counterintuitive message comes from a U.S. National Academies' panel that heaped scorn on current rules that control access to sensitive technical information by non-U.S. citizens as well as the release or export of certain items.

    The problem, the panel says in a report issued last week, is that the restrictions are bad for business, security, and science. The solution is to put restrictions on fewer technologies and tighten up on those technologies, such as nuclear weapons or chips to guide missiles, that are vital to U.S. security. “What we want is an open system,” says study co-author Brent Scowcroft. “The [better] premise is that everything is open except those things you can justify [a] need to be [restricted].”

    The study calls for President-elect Barack Obama to set up new White House bodies to simplify how scientists and companies obtain export licenses and resolve disputes between agencies that delay decisions. It says that export controls on specific items or technologies should last no longer than 1 year unless the government can recertify them. The report also calls for streamlining the process for providing visas for foreigners working in scientific areas. (The study, initiated by the academies, didn't deal with the issue of which information should be labeled as classified.)

    Maintained by the State Department, Commerce Department, and the Pentagon, export-control rules determine what products and components can be exported, which concepts can be discussed in open scientific papers, and which subjects are acceptable to discuss with foreign nationals. In the past 2 years, the three agencies have set new time limits to reduce delays in issuing export licenses, removed items that pose no risk from protected lists, and eased restrictions on technical information or products that are readily available on the Internet or overseas. But report co-author John Hennessy, president of Stanford University in Palo Alto, California, said such moves are just “incremental improvements.”

    Killing the MESSENGER.

    Rules regulating satellite technology almost knocked out a plasma sensor on a 2004 NASA probe to Mercury.


    Space scientist Thomas Zurbuchen of the University of Michigan, Ann Arbor, says the current rules are “an obstacle to universities in the United States … from being the leaders in space research.” The Swiss-born researcher was the lead scientist on a project to build a plasma sensor for a Mercury-bound NASA spacecraft called MESSENGER that was launched in 2004. But in 1999, Congress tightened export-control laws, and project managers restricted Zurbuchen's access to design documents he had created. Fortunately, Zurbuchen soon received a green card that exempted him from these restrictions. “We got lucky,” he says.

    One member of the new Administration likely to be sympathetic to such changes is Defense Secretary Robert Gates, former director of central intelligence and a holdover from the Bush Administration. Gates was a member of the academies' panel before he joined the Cabinet in December 2006. In an e-mailed statement, his spokesperson said the report's recommendations “would have a significant, positive impact” on defense science, though it didn't specify how.

    Congress has resisted previous calls for easing export controls, fearing that greater sharing would aid the country's enemies. Supporters worry that any attempt to implement the panel's recommendations could backfire: Legislators could instead pass even more restrictive laws.


    Can an Ecologist Push NOAA Up the Food Chain?

    1. Erik Stokstad

    Marine ecologist Jane Lubchenco launched her career studying the complexity of the intertidal zone—who eats whom, for example, and how ecosystems are affected by internal dynamics and external forces. Her colleagues expect that intellectual background to serve her well in the turbulent waters of U.S. federal politics if, as expected, she's confirmed as head of the National Oceanic and Atmospheric Administration (NOAA), part of the Department of Commerce.

    Nominated last month by President-elect Barack Obama, Lubchenco, 61, would assume the reins of a $3.9 billion agency squeezed by cost overruns from a problematic satellite system and beset with aging research infrastructure. Her political bosses are also likely to ask it to play a bigger role in studying and preparing for climate change. “She's got a big job ahead of her,” says Marcia McNutt, head of the Monterey Bay Aquarium Research Institute in Moss Landing, California. Although Lubchenco has less management experience than recent administrators, McNutt and other supporters say her tenacity and her experience in teaching scientists how to talk to policymakers augur well for her chances of success.

    Lubchenco, a faculty member for 30 years at Oregon State University, Corvallis, has branched out beyond basic research into issues of ocean health and sustainable management. A turning point, says Pamela Matson of Stanford University in Palo Alto, California, was her participation in a strategic planning exercise for the Ecological Society of America (ESA). The effort led to a 1991 “Sustainable Biosphere Initiative” that identified global climate change as a priority research topic for ecologists and called for a focus on improving management of Earth's life support systems. “After that, she became much more engaged in scientific leadership and communication and more focused on the use of science in decision-making,” Matson says.

    During her leadership of ESA in the early 1990s, Lubchenco helped move the society's offices to Washington, D.C., and set up a policy staff. “She's not a caretaker leader,” says Matson. Lubchenco continued to advocate for injecting more science into policymaking when she was president, then chair of the board, of AAAS (which publishes Science) from 1996 to 1998.

    Lubchenco is a talented dealmaker as well, notes Andrew Rosenberg of the University of New Hampshire, Durham. While serving on the privately funded Pew Oceans Commission, Lubchenco helped broker a merger in 2005 with the publicly funded U.S. Commission on Ocean Policy. “I don't think it was a foregone conclusion that they would work together,” says McNutt.

    Lubchenco also has a knack for recognizing a need and filling it. In 1993, she founded the Aldo Leopold Leadership Program, which trains scientists to communicate with policymakers and others, and 6 years later she began a similar organization, the Communication Partnership for Science and the Sea, to help publicize policy-relevant findings. Lubchenco's polished communication skills will help her build support for NOAA in Congress and among stakeholders, predicts David Fluharty of the University of Washington, Seattle, who chairs NOAA's Science Advisory Board. “This is where Jane is going to be very strong.”

    Speaking up.

    Jane Lubchenco has spent her career explaining science to others, including policymakers.


    That strength will be needed in running an agency whose responsibilities span basic oceanographic and atmospheric research, the National Weather Service (NWS), fisheries management, and hurricane forecasting. The $14 billion National Polar-orbiting Operational Environmental Satellite System, for which NOAA is the lead agency with NASA and the Department of Defense, is behind schedule, over budget, and facing technical challenges. In addition, cost overruns in preparing for the 2010 census, managed by the Commerce Department, could affect NOAA and other departmental activities. Meanwhile, more and more research vessels and other facilities need to be upgraded or replaced. “There are very tough choices,” says Conrad Lautenbacher, her predecessor, who resigned in October.

    A likely priority is improving NOAA's ability to forecast regional climate change impacts, a cause that Lubchenco has endorsed. (Lubchenco declined to be interviewed, as is customary before nominees are confirmed by the Senate.) NOAA has recently begun considering how to create a National Climate Service (NCS), which would be analogous to NWS (Science, 3 October 2008, p. 31).

    NCS would bring together NOAA activities now scattered across various branches. NOAA spends roughly $300 million on climate research and outreach, but Richard Spinrad, head of NOAA's $398-million-a-year Office of Oceanic and Atmospheric Research, says NCS will need a budget no smaller than the $900 million that NWS receives. “The diversity of products and services is apt to be broader for an NCS,” he says.

    Lubchenco will also have to scramble to fund what she sees as needed changes in fisheries management, including setting up marine reserves. A decade ago, Lubchenco proposed putting 20% of the ocean in reserves by 2020, 40 times the current percentage. “I would expect her to be much more cautious at setting goals as an administrator,” Fluharty says.

    Those who know Lubchenco say her approach to understanding is relevant to managing NOAA: Gather lots of data, think deeply, figure out what needs to be done, and then be an advocate for action. “I think she will harness and energize NOAA's considerable resources to bring about enormous change for the good,” says Nancy Baron, COMPASS Ocean Science Outreach Director. “It's an amazing opportunity.”


    A Human Trigger for the Great Quake of Sichuan?

    1. Richard A. Kerr,
    2. Richard Stone

    Natural disasters are often described as “acts of God,” but within days of last May's devastating earthquake in China's Sichuan Province, seismologists in and out of China were quietly wondering whether humans might have had a hand in it. Now, the first researchers have gone public with evidence that stresses from water piled behind the new Zipingpu Dam may have triggered the failure of the nearby fault, a failure that went on to rupture almost 300 kilometers of fault and kill some 80,000 people.

    Still, no one is close to proving that the Wenchuan quake was a case of reservoir-triggered seismicity. “There's no question triggered earthquakes happen,” says seismologist Leonardo Seeber of the Lamont-Doherty Earth Observatory in Palisades, New York. That fact and the new evidence argue that the quake-dam connection “is worth pursuing further,” he says, but proving triggering “is not easy.” And the Chinese government is tightly holding key data.

    Seismologists have been collecting examples of triggered seismicity for 40 years. “The surprising thing to me is that you need very little mechanical disturbance to trigger an earthquake,” says Seeber. Removing fluid or rock from the crust, as in oil production or coal mining, could do it. So might injecting fluid to store wastes or sequester carbon dioxide, or adding the weight of 100 meters or so of water behind a dam.

    Whatever the nature of the disturbance, it must bring a nearby fault to the point of failure to trigger a quake. In the case of reservoir-triggered seismicity, the water's weight can weaken a fault by counteracting the stresses that are squeezing the two sides of the fault together and tightly locking it. Or, the added weight can increase the stress already tending to push opposing sides past each other and break the fault. In 1967, impoundment behind the Koyna Dam in India triggered the largest known reservoir-triggered quake, a magnitude-6.3 temblor that killed 200 people. Seismologists recognize dozens of other reservoir-triggered quakes in the range of magnitude 3 to 6.
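    The scale of the disturbance a reservoir adds is easy to check with back-of-the-envelope arithmetic. The sketch below computes the hydrostatic load under the “100 meters or so of water” mentioned above; it is an illustrative order-of-magnitude estimate, not Klose's stress calculation.

```python
# Illustrative estimate of the vertical stress added by a reservoir's
# water column (hydrostatic pressure = density * gravity * depth).

RHO_WATER = 1000.0  # density of water, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2

def water_load_pa(depth_m):
    """Hydrostatic pressure (Pa) at the base of a water column of given depth."""
    return RHO_WATER * G * depth_m

load_mpa = water_load_pa(100.0) / 1e6
print(f"~{load_mpa:.2f} MPa of added vertical stress under 100 m of water")
# about 1 MPa -- small compared with crustal stresses, which is why
# triggering requires a fault already near the point of failure
```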

    So when the magnitude-7.9 Wenchuan earthquake struck, many scientists wondered if a reservoir was to blame. Ruling out the much-maligned Three Gorges Dam as too distant, experts considered the Zipingpu Dam, just 500 meters from the fault that failed and 5.5 kilometers from the quake's epicenter. The timing was right. The Zipingpu reservoir began filling in December 2004, and within 2 years the water level had rapidly risen by 120 meters, says Fan Xiao, a chief engineer of the Sichuan Geology and Mineral Bureau in Chengdu.


    The several hundred million tons of water behind Sichuan's Zipingpu Dam may have triggered a devastating earthquake.


    The several hundred million tons of water piled behind the Zipingpu Dam put just the wrong stresses on the adjacent Beichuan fault, geophysical hazards researcher Christian Klose of Columbia University said at a session last month at the Fall Meeting of the American Geophysical Union in San Francisco, California. In his talk, Klose coyly explained—without ever mentioning a dam—how the added water changed the stresses on the fault. According to his calculations, the added weight both eased the squeeze on the fault, weakening it, and increased the stress tending to rupture the fault. The effect was 25 times that of a year's worth of natural stress loading from tectonic motions, Klose said. When the fault did finally rupture, it moved just the way the reservoir loading had encouraged it to, he noted.

    Klose's listeners were intrigued but far from convinced. They wanted to hear more details about changing water levels and local, lower-level seismicity. Fan, who was not at the meeting, provides some of those details, all of which favor a link between the Zipingpu Reservoir and the earthquake. Judging by the history of known reservoir-triggered quakes, the rapid filling of Zipingpu as well as its considerable depth would have favored triggering, he says. The delay between filling and the great quake would have given time for reservoir water to penetrate deep into the crust, where it can weaken a fault. And the greatest danger of triggering comes not at the time of maximum filling, he argues, but when the water level is falling. “As we now know, a week before the May 12 earthquake, the water level fell more rapidly than ever before,” says Fan.

    A paper in last month's issue of the Chinese journal Geology and Seismology arrives at a similar conclusion. Zipingpu's impoundment “clearly affected local seismicity,” says lead author Lei Xinglin, a geophysicist at the China Earthquake Administration in Beijing and the National Institute of Advanced Industrial Science and Technology in Tsukuba, Japan. Lei emphasizes that a firm conclusion is premature, but he sees penetration of reservoir water into the fault and the reservoir decline between December 2007 and May 2008 as “major factors associated with the nucleation of the great Sichuan earthquake.”

    Fan also does not see the Zipingpu-Wenchuan connection as proven yet, but he's seen enough to urge caution. “We should readjust our existing plans and take a more cautious attitude when planning projects,” he says. “But I am pessimistic that many of these large-scale constructions will be canceled, because of the strong economic interests that benefit hydropower developers and local governments.”

    Building a stronger case for restraint, researchers in and out of China say, will require access to even more detailed data. “Time-variation evidence for seismicity of small earthquakes near and surrounding the reservoir, as well as for the water levels and loading of the reservoir, are needed,” says geophysicist Wen Xue-ze of the Sichuan Seismological Bureau in Chengdu. Fan believes that researchers in the Chinese Academy of Sciences have preliminary results from such studies, “but they are reluctant to share them.”


    Waltzing Quasars Provide Signpost to Merging Galaxies

    1. Yudhijit Bhattacharjee



    Astronomers currently spot merging galaxies such as this pair—known as the Mice Galaxies—by their anomalous shapes.


    Astronomers want to know how frequently galaxies merge to understand the broader evolution of the universe. A team reported at the annual meeting of the American Astronomical Society that it has found a new way of spotting galaxy mergers: by looking for the dance of death of their central black holes. Julia Comerford of the University of California, Berkeley, and her colleagues claim their findings add strength to the idea that merged galaxies can form quasars.

    Traditionally, astronomers find mergers by scanning the skies for oddly shaped galaxies or galaxies close together, apparently on the point of merger. Because most galaxies have a supermassive black hole at their center, Comerford and her team reasoned that galaxies that had merged recently would have two black holes spiraling inward. If the black holes had enough gas around them, the gas would collapse inward, releasing energy and converting one or both black holes into active galactic nuclei (AGNs), or quasars.

    So, the astronomers searched spectra from the DEEP2 Galaxy Redshift Survey—the most detailed picture of the universe from as far back as 8 billion years ago—to find the tell-tale light signature of AGNs. (They excluded galaxies with ongoing star formation because this mimics the AGN signature.) The researchers found 107 candidates and then examined their spectra for signs of a merger.

    Comerford's team figured that in a recently merged galaxy, the two spiraling AGNs would be moving much faster than the stars in the surrounding galaxy, and so the AGNs' emission lines would be Doppler-shifted out of step with the rest of the galaxy. Because the two AGNs would move in opposite directions, the emission lines of the first AGN would be shifted one way and the lines of the second the other, resulting in a double peak. That's exactly what the researchers found in two of the 107 galaxies, leading them to conclude that these were dual AGNs.
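    The double-peak signature can be sketched with the non-relativistic Doppler formula. The rest wavelength and velocities below are illustrative assumptions, not values from the DEEP2 analysis.

```python
# Sketch of the Doppler double-peak signature of two inspiraling AGNs.
# Line and velocities are hypothetical, chosen only for illustration.

REST_LINE = 5007.0  # rest wavelength of the [O III] emission line, angstroms
C = 299792.458      # speed of light, km/s

def observed_wavelength(rest, v_los):
    """Non-relativistic Doppler shift for line-of-sight velocity v_los (km/s)."""
    return rest * (1.0 + v_los / C)

# The two AGNs orbit in opposite directions along the line of sight,
# while the host galaxy's stars are (on average) at rest in this frame.
v_agn1, v_agn2 = +150.0, -150.0  # km/s, hypothetical orbital velocities

peak1 = observed_wavelength(REST_LINE, v_agn1)  # redshifted peak
peak2 = observed_wavelength(REST_LINE, v_agn2)  # blueshifted peak
print(f"Peaks at {peak2:.2f} A and {peak1:.2f} A, "
      f"split by {peak1 - peak2:.2f} A around the galaxy's rest line")
```

    One peak lands redward and one blueward of the galaxy's own line, which is the split Comerford's team searched for in the survey spectra.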

    In another 35 galaxies, the emission lines appeared to be from a single AGN but were offset from the lines of the surrounding galaxy, suggesting an AGN spiraling around a black hole. So 37 galaxies in all—more than a third of the AGN-bearing galaxies in the study—seemed to have undergone a recent merger. From this, the researchers concluded that galaxies from between 4 billion and 7 billion years ago underwent three mergers every billion years. Although that's six times the rate predicted by theoretical models, Comerford says it is close to previous estimates of merger rates derived from observations. She says the discrepancy with the models is because they only consider mergers between two big galaxies.

    Aaron Barth, an astronomer at the University of California, Irvine, says the new technique is “very useful because it's completely independent of other methods to estimate the merger rate.” The surprisingly high percentage of AGNs showing “offset nuclei” points to “a very clear link between galaxy mergers and the buildup of black holes by accretion of gas,” he says.


    Do Black Holes Seed the Formation of Galaxies?

    1. Yudhijit Bhattacharjee


    Which came first, the galaxy or the black hole? Astronomers have been pondering that question because most galaxies have massive black holes at their centers. Are the black holes the seeds around which galaxies grow, or do they form after the galaxies have already taken shape? The verdict from a study presented at the annual meeting of the American Astronomical Society is that black holes come first. The findings could lead to a better understanding of how galaxies are born.

    Galaxies in the nearby universe appear to be uniformly about 700 times as massive as the black holes at their core. Many view this ratio as a clue to the link between galaxy formation and black holes. To understand that link, a team led by Chris Carilli of the National Radio Astronomy Observatory in Socorro, New Mexico, and Dominik Riechers of the California Institute of Technology in Pasadena set out to establish whether black holes lead to galaxy growth or vice versa.

    The researchers looked at distant galaxies, as far back as 1 billion years after the big bang. Carilli and his colleagues, using the Very Large Array radio telescope in New Mexico and the Plateau de Bure Interferometer in France, looked at the radio signals from gas clouds in four such galaxies and deduced their masses. They then compared each galaxy's mass with the mass of its central black hole, which had been measured using optical telescopes. They found that the galaxies were only about 30 times as massive as their central black holes. If galaxies in the early universe carried less than 5% of the galactic flesh around their cores that today's galaxies do, Carilli and his colleagues reasoned, then the black holes must have formed before the galaxies did.
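    The “less than 5%” figure follows directly from the two ratios the article quotes, as this quick arithmetic check shows:

```python
# Back-of-the-envelope check of the mass-ratio argument above.
# The 700:1 and 30:1 ratios come from the article; the rest is arithmetic.

nearby_ratio = 700.0  # galaxy mass / black-hole mass, nearby universe
early_ratio = 30.0    # same ratio for the four early galaxies studied

fraction = early_ratio / nearby_ratio
print(f"Early galaxies carry {fraction:.1%} of the galactic mass per unit "
      "of black-hole mass that nearby galaxies do")
# roughly 4%, i.e. "less than 5%"
```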

    “It appears that the black holes come first and grow the galaxy around them,” says Carilli. Eventually, like a ripening fruit, the galactic mass grows until its ratio to the black hole mass reaches what astronomers observe in nearby galaxies. Carilli and his colleagues acknowledge that the results seem to contradict findings suggesting that the enormous energy emanating from active black holes inhibits galaxy growth around them.

    Astronomer Andrew Fabian of the University of Cambridge, U.K., calls the results “very interesting” and says they are “bound to generate theoretical work” on how black holes and galaxies influence each other. But he's reserving judgment on whether black holes did indeed come first because the conclusions are “deduced from gas motions, not stars,” and gas clouds can be affected by nongravitational forces such as magnetic fields, causing astronomers to underestimate the ancient galaxy's mass.


    Infection Study Worries Farmers, Bird Lovers

    1. Martin Enserink

    AMSTERDAM—Dutch researchers have embarked on a study in which they deliberately infect wild swans with avian influenza to study the effects on bird migration and viral spread. Although the researchers are using a virus that causes no detectable disease in birds and poses no risk to humans, the experiment has triggered consternation among poultry farmers and bird lovers, who worry that the virus could infect poultry or threaten wild birds. Concerns have reached all the way to the Dutch parliament.

    The researchers insist that the risks are vanishingly small because the experiment involves a low-pathogenic influenza subtype called H4N6 that is already ubiquitous in nature. They also point out that the protocol was rigorously reviewed and deemed safe, and the scientific payoff could be considerable.

    Scientists know relatively little about how avian influenza—including H5N1, the dangerous strain that has sparked fears of a worldwide pandemic—affects bird migration, which in turn is believed to influence the virus's epidemiology. In a 2007 study, researchers from the Netherlands Institute for Ecology (NIOO) in Heteren and the Erasmus Medical Center in Rotterdam showed that wild Bewick's swans naturally infected with low-pathogenic strains H6N2 and H6N8 fed less frequently and departed on their winter migration later than uninfected birds. But it wasn't clear whether being infected caused the birds to eat less, or vice versa.

    Trial or error?

    An experiment in which Bewick's swans (inset) are infected with bird flu has veterinarian and Member of Parliament Henk Jan Ormel worried.


    In the new study, the same two groups hope to answer that question. The protocol calls for 16 Bewick's swans to be caught at overwintering sites in the Netherlands, infected with H4N6, then released and tracked using GPS collars for 18 months. Sixteen other birds will serve as controls, while a third group will be injected with phytohemagglutinin, a compound that switches on the immune system in a way that mimics disease. Because H4N6 is so widespread, infecting 16 wild birds is like adding a drop of water to the ocean, says NIOO researcher Marcel Klaassen. Unlike the H5 and H7 strains, H4 has never been known to mutate to highly pathogenic versions or to pose a threat to humans.

    Even so, realizing that “this would be very sensitive,” the team cleared the study not just with an animal experimentation committee, as Dutch law requires, but also with officials at the Dutch health and agriculture ministries, says Albert Osterhaus, who leads the Erasmus group. The team also consulted independent virologists in the Netherlands and abroad.

    “I must admit [that] when I was first asked for a review of this, I was unsure of the ethical considerations,” says bird flu researcher Ian Brown of the Veterinary Laboratories Agency in Weybridge, U.K. But Brown became convinced that the study poses a negligible risk and has scientific merit.

    Others are not so sure. “It's quite something to deliberately release a virus into the wild,” says Henk Jan Ormel, a veterinarian and member of the Dutch House of Representatives. Ormel has asked Minister of Health Ab Klink whether he believes the study is safe for humans and poultry and, if not, whether it can be banned. Trinus Haitjema, a Dutch amateur ornithologist who specializes in Bewick's swans, worries that the study may harm the species, whose numbers have dropped sharply over the past 15 years.

    For now, however, the study is on hold, although not because of safety concerns. Because of unusually cold temperatures in January, the researchers decided not to catch any more birds to avoid putting more stress on the animals. Unless an interim analysis of the data later this year yields clear-cut results, researchers hope to complete the study next year. By then, they hope any public opposition will have subsided.


    CDC's Gerberding Makes an Early Exit

    1. Jennifer Couzin

    Julie Gerberding, the first woman to lead the U.S. Centers for Disease Control and Prevention (CDC), is stepping aside after more than 6 years of a sometimes rocky tenure at the agency's helm. Her resignation, announced last week, came early but was not a surprise, given that the new Democratic Administration is looking for its own CDC chief. President-elect Barack Obama is expected to nominate his choice shortly; in the meantime, Gerberding's deputy, Chief Operating Officer William Gimson, who has a business degree and has spent more than 30 years at CDC, will fill in.

    Gerberding, an infectious-disease specialist, came to CDC during a tumultuous time, less than a year after the 9/11 terrorist attacks and anthrax mail attacks. Like other agencies such as the U.S. National Institutes of Health, CDC received a massive increase in bioterror defense funding, which Gerberding directed toward boosting surveillance of and response to possible biological attacks and other terrorist threats. She also juggled a dizzying series of disease outbreaks and potential outbreaks, from SARS in China to monkeypox in the United States, to widespread fears of a global influenza pandemic that hasn't materialized.

    All these shifts left CDC more “politically visible,” says James Curran, dean of the Rollins School of Public Health at Emory University in Atlanta, Georgia, who spent years at CDC and helped lead the agency's response to HIV/AIDS in the early days of the epidemic. Several CDC experts noted that Gerberding was unusually adept at communicating with the public, an increasingly necessary task. “For those of us out in the field, far too often people from on high at CDC have made statements that meant nothing [to us]. Julie really tried to connect CDC with the average person,” says Michael Osterholm, director of the Center for Infectious Disease Research and Policy at the University of Minnesota, Minneapolis, and a longtime supporter of Gerberding—and, he hastens to note, of Barack Obama.

    Saying goodbye.

    Julie Gerberding is leaving CDC to make way for an Obama appointment.


    But Gerberding's leadership of CDC was also marked by tension and sharp criticism, especially after she launched an extensive reorganization of the agency. Morale among CDC scientists reportedly plunged, and five former CDC directors wrote Gerberding a letter 3 years ago expressing “great concern” about the departure of top scientists from the agency. In a sometimes testy interview with Science in late 2006, Gerberding defended the changes and expressed confidence that they were needed to help CDC tackle large-scale health threats (Science, 13 October 2006, p. 246).

    The agency is “clearly at a crossroads” now, says James LeDuc, who spent 14 years at CDC before leaving at the end of 2006 to join the University of Texas Medical Branch at Galveston. Whoever takes over next will have to ensure that young, talented scientists come and stay. And, adds Curran, they must prove adept at another critical task—communicating with Congress and the White House.


    Tracking CO2's Comings and Goings From Space

    1. Dennis Normile

    Climate scientists trying to better understand Earth's carbon cycles have long been hampered by tunnel vision. Ground-based carbon dioxide (CO2) monitoring is precise, but the 100-odd stations across the globe provide insufficient coverage, particularly in developing countries and over the oceans. Soon, however, there will be two eyes in the sky with all-encompassing views of this worrisome greenhouse gas. In the next few weeks, Japan and the United States plan to launch satellites to observe CO2 from space.

    The view from on high should lead to more accurate predictions of how rising CO2 levels might affect global temperatures and climate change. “This will also contribute to political decisions on [acceptable levels] of CO2 emissions,” says Tatsuya Yokota, who heads the satellite observation office of Japan's National Institute for Environmental Studies (NIES) in Tsukuba. In addition, patchy data have been a “barrier to coloring in the maps of CO2 sources and sinks,” says Peter Rayner, a climate modeler at the Laboratoire des Sciences du Climat et de l'Environnement near Paris. “There are large parts of the globe where this will be our first look [at CO2 data].”

    Japan will launch its Greenhouse Gases Observing Satellite (GOSAT) on 21 January. NASA's Orbiting Carbon Observatory (OCO) will follow on 23 February. Both intend to answer a fundamental question: Where is CO2 generated by human activities coming from and going to? Each year, humans dump about 9 billion tons of carbon into the atmosphere, but only half stays there, says David Crisp, principal investigator for the $270 million OCO at NASA's Jet Propulsion Laboratory in Pasadena, California. Of CO2 recycled from the atmosphere, about one-quarter is absorbed by land vegetation and another quarter is somehow drawn into the oceans. “We don't know where the other half is going,” Crisp says.
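    The bookkeeping Crisp describes can be sketched in a few lines. This is illustrative arithmetic only: the function name is invented, and the fractions are the round numbers quoted above, not a carbon-cycle model.

    ```python
    def carbon_budget(emitted_gt=9.0):
        """Partition the annual anthropogenic carbon flux using the rough
        fractions quoted in the article (illustrative numbers only)."""
        airborne = emitted_gt * 0.5          # roughly half stays in the atmosphere
        recycled = emitted_gt - airborne     # the half drawn back out each year
        land = recycled * 0.25               # absorbed by land vegetation
        ocean = recycled * 0.25              # somehow drawn into the oceans
        missing = recycled - land - ocean    # the unlocated "missing sink"
        return {"airborne": airborne, "land": land,
                "ocean": ocean, "missing": missing}

    print(carbon_budget())
    # → {'airborne': 4.5, 'land': 1.125, 'ocean': 1.125, 'missing': 2.25}
    ```

    On these figures, roughly 2 billion tons of absorbed carbon per year remain unaccounted for, which is the gap the two satellites are meant to close.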

    Parallel views.

    Japan's GOSAT (left) and NASA's OCO will provide the first global views of CO2.


    How these carbon sinks might evolve as climate shifts in response to rising CO2 levels is also unclear. And scientists can't begin to fathom the missing sinks until they've been located.

    There are other mysteries, such as large variations in atmospheric CO2 concentrations from year to year. In 1973, virtually all of the 5 billion tons of carbon put into the atmosphere stayed there; but the following year, 4 billion out of 5 billion tons that were emitted got absorbed by sinks, Crisp says. In another riddle, in 1993, a major El Niño coincided with high rates of CO2 absorption; the link and mechanism are unclear. GOSAT, a $500 million joint effort of Japan's space and environment agencies and NIES, has a mission lifetime of 5 years (versus OCO's 2 years) because scientists want “to detect annual variations in CO2 [resulting from] El Niño, La Niña, and other weather phenomena,” Yokota says.

    GOSAT will also measure methane, a greenhouse gas for which there are even fewer data. Both missions might also contribute to understanding localized problems by helping pinpoint pollution sources.

    The satellites grew out of ongoing Earth-observation programs. Crisp says that the CO2 data gap was long recognized but that improved detection was beyond standard remote-sensing techniques. Previous satellite sensors for ozone worked at thermal or ultraviolet wavelengths, but thermal wavelengths don't work well for CO2. The new satellites will observe in near-infrared. “The measurement technique is one of the real innovations that OCO and GOSAT have had to make in order to move forward,” says Crisp.

    The two satellites will observe in different patterns; OCO will be more sensitive to fortnightly or monthly rhythms, whereas GOSAT will be better able to correlate CO2 levels with changing weather patterns. “The data will be highly complementary,” says Yokota. That kind of stereovision might be just what it takes to spot those missing carbon sinks.


    Astronomy's Greatest Hits

    1. Tim Folger*
    1. Tim Folger is a contributing editor at Discover and On Earth magazines and lives in Gallup, New Mexico.

    In honor of the International Year of Astronomy, Science presents a timeline showing what we've learned in the 400 years since the invention of the telescope.

    When Galileo first pointed his telescope at the moon in 1609, the light from some of the discoveries mentioned in these pages was just passing the Pleiades star cluster, some 400 light-years from Earth. (The moon's light took about 1.25 seconds to reach Galileo's telescope.) During those 400 years, astronomers learned to capture light from ever-greater distances, revealing a universe that became (and continues to become) ever larger and stranger. Here is a sampler of what we've learned from 4 centuries of harvesting photons.

    Download the PDF


    A Brief History of the Telescope

    1. Larry Gonick,
    2. William Alschuler

    Writer/illustrator Larry Gonick and astronomer William Alschuler present a comic strip history of the telescope as part of Science's special Focus package commemorating the International Year of Astronomy.


    Download the PDF


    Astronomy Hits the Big Time

    1. Adrian Cho,
    2. Daniel Clery

    Four hundred years after the invention of the telescope, astronomy is flourishing. But even as the discoveries keep coming, the field is rapidly evolving toward huge telescopes, large collaborations, and—alas—bigger headaches.

    Because anyone can search the sky, astronomy remains the most democratic of sciences—perhaps the only one in which an amateur can still make a bona fide discovery. In August 2007, Hanny van Arkel did just that. The primary-school teacher from Heerlen, the Netherlands, spotted a strange blue blob in the sky. The intergalactic ghost turned out to be an enormous cloud of gas that is reflecting the light lingering from a now-dead quasar in a nearby galaxy to create a never-before-seen “light echo.” The discovery of Hanny's Voorwerp (Dutch for Hanny's Object) earned Van Arkel, 25, a moment of fame. “My name was all over the world, and that's fun,” she says.

    At the same time, the discovery highlights dramatic changes within astronomy. Van Arkel made her find not by looking through a telescope—she doesn't own one—but by viewing on her computer some of the millions of images of galaxies captured by the Sloan Digital Sky Survey, an 8-year-old project cataloging everything that can be seen in a vast swath of sky with a 2.5-meter telescope on Apache Point in New Mexico. Van Arkel is one of more than 160,000 volunteers helping to classify 1 million galaxies as part of an outreach program called Galaxy Zoo.

    Data factory.

    The Sloan Digital Sky Survey's 2.5-meter telescope catalogs all it can see. An amateur found the strange “light echo” known as Hanny's Voorwerp (inset, green) in the mounds of data.


    Four hundred years after the invention of the telescope, the heavens continue to yield mind-bending surprises: stellar explosions called gamma-ray bursts that momentarily outshine the rest of the entire universe in gamma-ray light, the bizarre dark energy that is stretching space and speeding the expansion of the universe, and strange planets orbiting other stars.

    But even as the science flourishes, the practice and culture of astronomy are changing. Telescopes have grown steadily over the centuries, but the ones now in planning are truly immense—optical behemoths with mirrors measuring 30 meters or more across, and radio-telescope arrays spanning thousands of kilometers. Their costs will be measured in billions of dollars apiece. Meanwhile, some researchers are performing huge surveys that take a whole new approach to collecting data, spotting everything in sight and recording it all in vast computerized databases. Already considered “big science,” astronomy is rapidly growing much bigger. And with that growth comes some of the headaches that plague other fields: increasing competition for limited resources and longer times to see projects completed.

    “We shouldn't lose sight of the fact that it's good to have this problem,” says Roger Blandford, a theoretical astrophysicist at Stanford University in Palo Alto, California. “It's a sign that astronomy is as intellectually and scientifically exciting as it's ever been.” Michael Turner, a cosmologist at the University of Chicago in Illinois, agrees: “This is a very special time in astronomy, when you finally know enough about the universe to ask a variety of big questions, and you have the tools to go after the answers.”

    Still, as astronomers reap an ever-greater understanding, they may be losing the romance of their craft. As computerized data streams and remote-controlled observatories become the norm, the lone astronomer trekking to the mountaintop for a night of observing is quickly becoming a quaint anachronism.

    A variety of riches

    Strikingly, as astronomers have learned more, their field has continued to grow more diverse. Whereas scientists in fields such as particle physics have homed in on a few key conceptual questions—is there a Higgs boson?—astronomers find themselves blessed with an ever-longer list of mysteries ripe for exploration: What's speeding up the expansion of the universe? How did the first galaxies form? Where do cosmic rays come from? What is the nature of the black hole in the middle of our galaxy?

    Much of this progress has been driven by technology. Astronomers have continued to improve their ability to detect the electromagnetic radiation of various wavelengths that emanates from stars and galaxies, from the advent of radio astronomy in the 1940s, to the birth of x-ray astronomy in the 1960s, to precision studies of the microwaves lingering from the big bang starting in the 1990s.

    Most recently, the high-energy gamma-ray universe is coming into focus. These energetic photons are too rare to be picked up by the small telescopes of orbiting observatories and are blocked by the atmosphere. But when one strikes the atmosphere, it sets off an avalanche of electrons and other charged particles. As they zip through the air, these particles produce a pulse of light called Cerenkov radiation that special ground-based telescopes can detect. In 2004, the four telescopes of the High Energy Stereoscopic System (H.E.S.S.) in Göllschau, Namibia, became the first to spot a source of high-energy gamma rays shining in the sky (Science, 3 September 2004, p. 1393; 5 November 2004, p. 956).

    Other Cerenkov telescopes, including VERITAS in the United States and MAGIC in the Canary Islands, have joined the search, and astronomers now have more than 100 sources to study. Such high-energy gamma rays are thought to originate in violent events such as supernovas, gamma-ray bursts, and supermassive black holes sucking in matter at the hearts of distant galaxies. “People are very, very excited about progress with the new telescopes,” says astrophysicist Masahiro Teshima of the Max Planck Institute for Physics in Munich, Germany, who works on MAGIC. “Their information gives us a clear picture of what is happening in these violent sources.”

    Novel approaches have also opened realms of discovery. Michel Mayor of the University of Geneva, Switzerland, developed a way to search for parts-in-10-million variations in the color of starlight, such as those a planet can produce by tugging on its star. The frequency of the light increases as the star is pulled toward Earth and decreases as it is pulled away, just as the pitch of a siren rises as a police car approaches and falls as it speeds away.
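    As a back-of-the-envelope check (not from the article), the non-relativistic Doppler relation turns a parts-in-10-million frequency shift into a stellar velocity of roughly 30 meters per second, the scale of wobble a close-in giant planet induces on its star:

    ```python
    C = 299_792_458.0  # speed of light, m/s

    def reflex_velocity(fractional_shift):
        """Line-of-sight velocity implied by a fractional frequency shift,
        via the non-relativistic Doppler relation v = c * (delta_f / f)."""
        return C * fractional_shift

    # A one-part-in-10-million shift corresponds to a wobble of about 30 m/s.
    print(round(reflex_velocity(1e-7), 1))  # → 30.0
    ```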

    In 1995, Mayor and colleagues detected the first planet beyond our solar system—a Jupiter-sized giant whizzing around its star once every 4.2 days. Astronomers have since discovered more than 300 extrasolar planets. Those often bizarre other worlds have brought with them more questions than answers, Mayor says, because “we have discovered that our own solar system is in no way typical of what can form in the universe.”

    Astronomy has also begun to attract more scientists from other fields. They include geologists interested in planet formation, biologists seeking the chemical precursors of life, and—perhaps most notably—particle physicists fascinated by the mystery of dark energy. Since 1998, astronomers have known that some sort of space-stretching energy is accelerating the expansion of the universe, and many physicists are determined to find out what that stuff is.

    Stellar strip mining

    Near the 2788-meter summit of Apache Point, a telescope juts above the spindly pines, its rectangular casing resembling a big cardboard box. With a 2.5-meter mirror, it isn't a large telescope. Yet perhaps no other is changing the practice of astronomy as dramatically as this one, which feeds the Sloan Digital Sky Survey, a $150 million effort supported by the private Alfred P. Sloan Foundation, federal agencies, and the 25 participating institutions.

    Traditionally, astronomers have taken turns using telescopes, with individuals or small teams applying for observing time to prospect for the astronomical gems that intrigue them. In contrast, since 2000, the 150 members of the Sloan team have worked together to spot all they can in a quarter of the celestial sphere, including 100 million galaxies. Smaller teams then sift through the haul from this celestial strip-mining operation before the data are eventually made public.

    Flash photography.

    The four telescopes of H.E.S.S. in Namibia look out for the flashes caused by gamma rays hitting the upper atmosphere.


    “People giggled when we put out papers with 100 authors,” says Michael Strauss of Princeton University. “But we showed that that many astronomers could get along without killing each other and [that] a large survey could be enormously scientifically productive.” For example, Sloan has traced the distribution of galaxies, revealed the structure of our own Milky Way galaxy, and helped explain the origins of different types of asteroids.



    The Sloan survey also melds scientific cultures. Particle physicists from Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, have provided key expertise in handling both large collaborations and huge data streams, says David Weinberg, an astronomer at Ohio State University in Columbus. “Without the resources that Fermilab had to bring, I don't think the project would have succeeded,” he says.

    The Sloan survey is just the first of several enormous surveys in the works. The proposed Panoramic Survey Telescope & Rapid Response System (Pan-STARRS), a $100 million array of four 1.8-meter telescopes whose construction is funded by the U.S. Air Force, would mainly spot asteroids that menace Earth, but it would also do survey work surpassing Sloan in sensitivity. A prototype telescope is already working on Haleakala in Hawaii. And the planned Large Synoptic Survey Telescope (LSST), a $400 million instrument that would be paid for by the U.S. National Science Foundation (NSF) and private contributors, would sit atop Cerro Pachón in Chile and use a 3.2-gigapixel camera to image an entire hemisphere of the sky once every 3 nights, spotting 2 billion objects. That's too many images for humans to look at, so instead they'll have to be “viewed” by specially programmed computers.

    To be sure, astronomers have surveyed the skies before. From 1948 to 1958, astronomers at the Palomar Observatory in southern California used a 1.22-meter telescope to produce 937 photographic plates that others used for decades to guide their searches. But the new surveys aim to not only provide better data but also tackle new types of statistical studies.

    Following Sloan's lead, Pan-STARRS and LSST will trace the three-dimensional distribution of galaxies and their apparent orientations on the sky. Those measurements should reveal the interplay between space-stretching dark energy and dark matter, the mysterious stuff whose gravity holds galaxies together. Only digital cameras and high-power computing make that statistical approach practical, says LSST's project leader, Anthony Tyson of the University of California, Davis.

    Some astronomers say that after such statistical measurements are finished, attention will move away from huge surveys. Others argue that the push toward surveys will permanently change the character of astronomy. “It's here to stay, and it will grow as an overall fraction of the field,” Weinberg says.

    Thinking big.

    Europe's planned Extremely Large Telescope, with its 42-meter mirror, will study the chemistry of planets around other stars.


    Watchful giants

    Astronomers generally agree that in the future many of them will continue to work in smaller teams to seek out and study individual astronomical objects. That's because “often it is the best-case example [of an object] that tells you the most,” says Britain's Astronomer Royal, Martin Rees of the University of Cambridge. But those observations may be made with gargantuan telescopes, and such giants will likely change the practice of observing the sky.

    The size of optical telescopes has grown steadily over the past century, from the 60-inch (1.5-meter) telescope at Mount Wilson in California, inaugurated in 1908, to the Large Binocular Telescope at Mount Graham, Arizona, whose twin 8.4-meter mirrors saw first light in 2007. But the next growth spurt will be a big one, as telescope designers have mastered combining many small reflectors into one huge segmented mirror. In the United States, two such extremely large telescopes (ELTs) are in preparation. Grinding has begun on the seven 8.4-meter mirrors that will make up the equivalent of a 24.5-meter reflector for the Giant Magellan Telescope to be built at Las Campanas in Chile, and the Thirty Meter Telescope, with a single giant segmented mirror, is in its design phase. Meanwhile, European astronomers are designing their own European-ELT (E-ELT) with a 42-meter segmented mirror.

    Although such gigantic telescopes may be shared in the traditional way, most astronomers won't enjoy the usual hands-on interaction with the machinery. For example, at the European Southern Observatory's Very Large Telescope—an array of four 8.2-meter telescopes on Cerro Paranal in Chile—40% of observations are carried out by astronomers who come to the site on allotted nights, says ESO's Roberto Gilmozzi, E-ELT principal investigator. With a giant such as E-ELT, that will be difficult.

    Instead, with thousands of astronomers clamoring for observing time, the scheduling of observations and steering of the telescope are likely to be fully automated to squeeze out every useful second. No astronomer will need to travel to the mountain—nor will any be encouraged to do so. In fact, Rees says, the largest optical telescopes will likely be run much like the orbiting Hubble Space Telescope, entirely by remote control.

    The notion of a small observing team may also change. The competition for use of the telescope will push astronomers into larger and larger collaborations. Astronomers already gang together for projects that need a lot of observing time to gather large statistical samples. “These will evolve into even larger collaborations,” says Gilmozzi, as researchers design elaborate campaigns to make the most of scarce time.

    Radio astronomers are going through a similar process with plans for the Square Kilometer Array (SKA), whose thousands of networked dishes will include some more than 1000 kilometers from the array's compact core (Science, 18 August 2006, p. 910). Currently in its design stage, SKA will be built in either South Africa or Australia and will scan the sky more than 10,000 times faster than is currently possible. That ability means that large-scale surveys will become an even higher priority. “Surveys will be very big and will take up a lot of the telescope's time,” says SKA Director Richard Schilizzi. This will lead to sociological changes, “but I don't know how we'll handle that yet,” he says. “Will we have papers with 400 authors? I don't know.”


    Tighter budgets and stiffer competition

    While the size and cost of big telescopes are soaring, national astronomy budgets aren't keeping pace. So the competition for limited resources is growing fiercer. Such budget constraints are already starting to pinch. In 2005, NSF, which funds ground-based efforts, requested that astronomers conduct a “senior review” to free up $30 million in its $200 million annual astronomy budget. Among other cuts, the 2006 review recommended that NSF stop funding the gigantic and storied Arecibo radio observatory in Puerto Rico in 2011.

    The painful process was necessary, says Craig Foltz, acting director of NSF's division of astronomical sciences, because the agency did not have enough money to develop new projects such as LSST and an ELT or to pay the United States's share for operating the Atacama Large Millimeter Array (ALMA), a $1 billion international observatory under construction in Chile.

    European astronomers may face similar decisions as they enter the era of E-ELT and SKA. Just 2 months ago, they completed a road map of desired projects for the next 2 decades; now they are waiting to see how funders react to the requested 20% funding increase over the next 10 years (Science, 28 November 2008, p. 1313). “Even if you closed all the smallest telescopes, it doesn't make any inroads into the cost of the E-ELT,” says astrophysicist Michael Bode of Liverpool John Moores University (LJMU) in the U.K., who headed the road-map effort. “We'll need to increase budgets overall for the largest [projects].”

    As telescopes get bigger, the time needed to complete them is stretching, too. Each decade, U.S. astronomers rank their top future projects in a “decadal survey” sponsored by the U.S. National Research Council. None of the four large ground-based projects in the 2001 decadal survey has been completed yet, notes Wendy Freedman, an astronomer with the Observatories of the Carnegie Institution of Washington in Pasadena, California, and chair of the NSF-NASA Astronomy and Astrophysics Advisory Committee. Compounding matters, large projects are often international efforts involving multiple funding agencies. “We need to find a mechanism for getting all these funding agencies and countries to work together,” Freedman says. “That's a lot more complex” than working with a single funding agency.

    Those complexities became apparent in the development of ALMA, an array of 66 millimeter-wave dishes taking shape 5000 meters up at Llano de Chajnantor, a plateau in Chile's Atacama Desert. Widely regarded as the first worldwide collaboration in astronomy, the project originated when U.S. and European astronomers merged two rival projects in 1999. Japan joined in 2001.

    But getting the U.S.-European part of the project to work has not been easy. Early on, it was run by two management teams in Garching, Germany, and Charlottesville, Virginia, but that arrangement proved cumbersome and decision-making was slow. The project also found itself buffeted by factors outside its control, such as rapidly rising prices of steel, copper, and labor, and problems with prototype antennas provided by suppliers. As a result, in 2005, managers were forced to ask for a 40% budget increase to about $1 billion (Science, 19 May 2006, p. 990).

    U.S. and European funders eventually agreed to the increase. To streamline the project's management, the partners set up a joint team in Santiago, Chile's capital, to make day-to-day decisions. ALMA was then back on track, and the first of its dishes, most of them 12 meters across, was installed last month. Still, managers are feeling pressure to prove that such an effort can work. “We have a big burden on our shoulders,” says ALMA Director Thijs de Graauw. “If ALMA works, it will neutralize many of the critical comments about the size and global nature of such projects.”

    Looking up.

    The first dish of 66 that will eventually make up the 16-kilometer-wide ALMA array in Chile, the first global observatory.


    Dreads and dreams

    Some researchers have misgivings about the push toward huge telescopes and large collaborations. Astronomy may be starting down the “dinosaurs' path to extinction” taken by particle physics, a field that now consists of a few huge experiments with hundreds or thousands of collaborators, says Simon White, a theorist at the Max Planck Institute for Astrophysics in Garching. Bigger telescopes aren't necessarily more productive, White says, but researchers push for them because it's clear how they improve on their predecessors. “You can't plan for scientific discovery, but you can plan for technological advances,” he says.

    White also worries that large collaborations encourage researchers to overspecialize. “More and more people are trained basically to develop software for a specific measurement and not to look for something new,” he says. De Graauw sees a related problem. “Astronomers are getting so used to getting everything by computer that they're removed from reality,” he says. “It's getting hard to find astronomers who are familiar with instruments.” Others say the push to bigger teams and increased specialization is inevitable if the science is to advance. “The whole field is becoming much more professional,” says Bruno Leibundgut, ESO's science director. “If you want to build an instrument for a big telescope, you can't just go to a funding agency and say, ‘Here's our idea, and if it doesn't work, too bad.’ You need quality assurances and controls that four guys at an institute may not have.”

    Many say that small groups working with small telescopes still have plenty of opportunities to make real contributions. For a start, smaller observatories are important training grounds for younger astronomers and test beds for new ideas and techniques. “Training is a lot easier with a backyard instrument,” Schilizzi says. But smaller scopes need to do more than that to justify their continued existence. “They're never going to compete with big telescopes in extracting a spectrum of a faint object,” says LJMU's Bode, so instead they have to find niches that play to their strengths.

    For example, small scopes are good for reacting quickly to fast-moving events in the sky. Such agility is the bread and butter of the 2-meter Liverpool Telescope operated by the Astrophysics Research Institute at LJMU. If, say, an orbiting observatory sends word that it has sighted a new gamma-ray burst, the fully automated scope drops what it is doing, swings to the patch of sky indicated in the message, and compares what it sees with an image from an online database. If something interesting is afoot, it will start snapping pictures. “This can be done within a minute or two of a gamma-ray burst,” Bode says—probably faster than a bleary-eyed astronomer could react.
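    The response logic described above can be sketched as a short program. Every class and method name here is a hypothetical placeholder, not the Liverpool Telescope's actual control software; the stub telescope simply records the actions taken.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Alert:
        ra: float   # right ascension of the burst, degrees
        dec: float  # declination of the burst, degrees

    @dataclass
    class FakeScope:
        """Stand-in for a robotic telescope; logs each action it performs."""
        log: list = field(default_factory=list)
        def interrupt(self):        self.log.append("interrupt")
        def slew_to(self, ra, dec): self.log.append(f"slew {ra:.1f},{dec:.1f}")
        def expose(self):           self.log.append("expose"); return "new_image"
        def start_sequence(self):   self.log.append("imaging_sequence")

    def respond_to_alert(scope, alert, reference_image, differs):
        """Drop the current program, point at the burst position, and start
        an imaging sequence if the new frame differs from the archival one."""
        scope.interrupt()                   # drop whatever the scope is doing
        scope.slew_to(alert.ra, alert.dec)  # swing to the indicated patch of sky
        new_image = scope.expose()
        if differs(new_image, reference_image):
            scope.start_sequence()          # something interesting is afoot

    scope = FakeScope()
    respond_to_alert(scope, Alert(ra=120.0, dec=-30.0),
                     reference_image="old_image",
                     differs=lambda a, b: a != b)
    print(scope.log)
    # → ['interrupt', 'slew 120.0,-30.0', 'expose', 'imaging_sequence']
    ```

    In a real system, the `differs` step would be an image-subtraction pipeline rather than a string comparison, but the control flow is the same: no human needs to be awake for any of it.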

    Although such uses may provide a role for smaller telescopes in the future, they still don't require an astronomer to make the long trip to a distant mountaintop and spend a cold night gazing at the stars. “An important romantic part of the field is disappearing,” De Graauw says. Although astronomers are unlikely to complain as long as they can do the science they want, it's hard not to feel a pang of regret. Peer through your own telescope, and starlight millions of years old passes into your eye. It's an experience that inspires awe and wonder. With the advent of automated observatories and computerized data streams, it's also one that bears ever less resemblance to the practice of astronomy.