News this Week

Science  09 Nov 2012:
Vol. 338, Issue 6108, pp. 726
  1. Around the World

    Cambridge, U.K.

    British Antarctic Survey Keeps Its Identity

    CREDIT: WIKIMEDIA COMMONS/VINCENT VAN ZEIJST

    U.K. scientists were relieved by a decision last Friday to stop a highly unpopular proposed merger between the celebrated British Antarctic Survey (BAS) and the National Oceanography Centre. The Natural Environment Research Council (NERC), which oversees both centers, proposed the merger earlier this year as a cost-cutting measure. Researchers strenuously objected, arguing that the merger was ill-conceived and might hamper BAS's mission; three BAS senior staff members left the survey earlier this year, possibly because of conflicts over the proposal. The furor prompted NERC to make its decision last week, a month sooner than planned. On 31 October, the House of Commons Science and Technology Committee urged NERC to cancel the merger on the grounds that NERC hadn't made its case.

    Despite the merger's cancellation, ongoing uncertainty about the survey's future funding and its current leadership vacuum have cast a pall over what was supposed to be a banner year for BAS, including plans to drill into a subglacial Antarctic lake in December. NERC has committed itself to funding BAS at £42 million annually through 2015. http://scim.ag/BritAnt

    Dehradun, India

    Science Unclear on GM Crops

    Reddy

    CREDIT: WWW.INDIA.GOV.IN

    At a 29 October press conference, India's new science minister, S. Jaipal Reddy, took a skeptical stance on one of the country's hottest issues: genetically modified (GM) crops. In response to questions from Science, Reddy said “the science is not clear.”

    Reddy, a graduate in English and a career politician who is known as one of India's finest orators, took over as science minister on 29 October in a cabinet reshuffle by Prime Minister Manmohan Singh. His new job will require him to navigate the highly charged debate on GM crops. The prime minister's Scientific Advisory Council expressed support for biotechnology in an early-October report. But on 17 October, an expert panel appointed by India's Supreme Court to advise it on a GM case recommended a 10-year moratorium on GM crop field trials.

    “Not having a science background should not be a handicap for the new minister,” says physicist Ajay Sood, president of the Indian Academy of Sciences in Bangalore. But he says scientists expect the influential minister to have a “positive outlook to science.” http://scim.ag/SciMinGM

    Laos

    Megadam Gets Green Light

    Go ahead.

    Near the site of the proposed Xayaburi dam on the Mekong River.

    CREDIT: AP PHOTO/INTERNATIONAL RIVERS, PIAPORN DEETES

    Laos has given builders a green light to finish work on a controversial megadam at Xayaburi on the lower Mekong River, after a diplomatic pause this year to consult with neighboring leaders in Cambodia and Vietnam who were concerned about potential ecological impacts (Science, 12 August 2011, p. 814). Xayaburi, which is projected to cost more than $3.5 billion and generate 1285 megawatts of power, is primarily Thai-financed and will export most of its electricity to Thailand, although a portion of the energy will be resold to other buyers. Laos hopes the project will give its developing economy a huge boost. But opponents say this dam and 11 more now in planning could massively disrupt the lower Mekong's ecology, slowing the cyclical water movements that replenish the soil and sustain the region's migratory fish.

    London

    Journal Pushes for Transparency

    To help open up drug company data vaults, BMJ announced a new policy last week: Unless authors agree to share raw clinical data, their papers will not be considered for publication. In a sharply worded editorial announcing the policy, BMJ Editor Fiona Godlee decried the “wilful distortion” of science by scientists who “hide data, mislead doctors, and harm patients” by sitting on data useful for studying the safety and efficacy of treatments. BMJ has been pushing for the past 3 years to obtain unreleased clinical trial data on Roche's antiviral drug oseltamivir (Tamiflu). Starting next January, BMJ will require all authors to “make the relevant anonymised patient level data available on reasonable request.” Pharma giant GlaxoSmithKline has already adopted a similar policy, saying it will share anonymized patient records. Robert Califf, vice chancellor for clinical research at Duke University in Durham, North Carolina, and an expert on clinical trials, predicted in an e-mail that the trend toward opening up data will accelerate; he believes this will benefit society, although he says the details still need to be worked out.

  2. Newsmakers

    Marine Biological Laboratory Appoints New President

    Ruderman

    CREDIT: BACHRACH STUDIOS

    Harvard Medical School cell biologist Joan Ruderman is the new president and director of the Marine Biological Laboratory (MBL) in Woods Hole, Massachusetts. The announcement, made on 3 November, marks a first for MBL: Ruderman is the first woman to head the institution since its founding in 1888. She has been affiliated with MBL since 1974, when she took an embryology course that “changed my life in so many ways,” she says, including introducing her to the surf clam Spisula, which led to her discoveries of molecular drivers of cell division.

    Although MBL has struggled financially, Ruderman says its situation is no different from that of other research institutions, and she points to a recent fundraising campaign that exceeded its $125 million goal. As director, Ruderman says she hopes to expand MBL's programs to encompass environmental research at the molecular and cellular levels, including research on the developmental origins of adult-onset diseases such as breast cancer.

  3. Random Sample

    Downing Biodiversity Lessons

    CREDITS: AILEEN CROSSLEY (2)

    Biodiversity has driven doctoral candidates from Trinity College Dublin to drink. Since 5 November, the students have been entertaining patrons in Dublin's bars with snappy, 3-minute talks meant to expand on information printed on beermats that they distributed to 10 local pubs in August and again in November.

    “We want to capture people's interest and hope to stimulate discussion in the pub,” says Trinity doctoral candidate Erin Jo Tiedeken, who studies insect pollinators and toxic compounds in nectar.

    The colorful coasters invite contemplation of nuggets including the number of bees needed to produce the apples in a pint of cider, the river water in local brews, and Ireland's cold-water coral reefs. The beermat campaign is part of the “Biodiversity In Our Lives” project, and the students hope it will instill the message that biodiversity is valuable to everyday life, including common tipples.

    The students initially distributed their beermats to bars frequented by students and professors. But they also targeted working-class pubs. “At first they gave us the strangest looks, but one of the bartenders phoned up and said his buddy in another pub wanted some of our beermats,” Tiedeken says.

    By reaching out to a wider audience, Tiedeken and her colleagues hope to make the university, and their research, more accessible. They plan to further expand their project, having secured funding to print an additional 20,000 beermats. They'll distribute them, along with their flash lectures, to 20 more pubs throughout November.

  4. Meteorology

    Weather Forecasts Slowly Clearing Up

    1. Richard A. Kerr

    Ever-increasing computer power and new kinds of observations are driving weather prediction to new heights, but some kinds of weather are still not yielding.

    Spot on.

    An experimental high-resolution NOAA model produced this strikingly accurate forecast of the “D.C. derecho” (higher winds are orange and white) 12 hours ahead of the storm's arrival in D.C.

    CREDIT: EARTH SYSTEM RESEARCH LABORATORY/NOAA

    The machines aren't just challenging weather forecasters—they're taking over. Predicting tomorrow's weather, or even next week's? Computer models fed by automated observing stations on the ground and by satellites in the sky have had the upper hand for years. Predicting where a hurricane will strike land in 3 days' time? Computer models have outperformed humans since the 1990s, some models by wider margins than others (see p. 736).

    Into next week.

    The rising skill of the European Centre for Medium-Range Weather Forecasts' model has made forecasting the globe's weather more than 8 days ahead worthwhile.

    CREDIT: ECMWF

    “People like to joke about predicting the weather,” says meteorologist Kelvin Droegemeier of the University of Oklahoma, Norman, “but they have to admit that forecasting is a lot better than it used to be.” And they can thank vast increases in computing power as well as technological advances, such as sharper-eyed satellites and advanced computer programming techniques.

    But the machines have yet to complete their takeover. Human forecasters can usually improve, at least a bit, on numerical weather predictions by learning the machines' remaining foibles. And humans still do better than computer models at some tasks, including forecasting the fits and starts of evolving hurricanes. But that's not saying much; human forecasts of hurricane intensity are hardly more skillful today than they were 20 years ago.

    Researchers working on the forecast problem say they are closing in on breakthroughs that could soon put the machines out front in even the toughest areas of forecasting. “We're not just talking,” says Alexander MacDonald of the National Oceanic and Atmospheric Administration's (NOAA's) Earth System Research Laboratory (ESRL) in Boulder, Colorado, “we're building future models and seeing some spectacular results.”

    On to next week

    Nothing illustrates the computer-driven rise of weather forecasting skill like the 3-decade-long track record of the European Centre for Medium-Range Weather Forecasts (ECMWF) in Reading, U.K. Since 1979, ECMWF has been feeding the couple billion weather observations made each day into the most sophisticated forecast model available and running that model on the most powerful computer in the business, all in order to predict the weather around the globe as accurately and as far into the future as possible.

    From the beginning, ECMWF has been the world champ of medium-range forecasting. In 1980, its forecasts of broad weather patterns—the location and amplitude of atmospheric highs and lows—were useful out to 5.5 days ahead. Beyond then, the computer forecast became useless as the atmosphere's innate chaos swamped the model's predictive powers. Today, ECMWF forecasts remain useful into the next week, out to 8.5 days. That leaves the rest of the forecasting world, including the U.S. National Weather Service (NWS) with its less powerful computer, in the dust by a day or more.

    With the help of ever-improving computer models, forecasters are also making progress on an even tougher sort of prediction: where and when heavy rain and snow will fall. NWS forecasters have nearly doubled their skill at forecasting heavy precipitation, but it has been a long haul. They measure success on a scale from 0 (complete failure) to 1 (a perfect forecast). In 1961, they scored 0.18 at forecasting where 2.5 centimeters or more of rain (or 25 centimeters of snow) would fall a day later. Over the next 50 years, their score crept upward, never stagnating for much more than 5 years at a stretch, and it now stands at about 0.33.
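
    The article does not spell out how such a score is computed, but verification numbers of this kind are built from a simple contingency table of hits, misses, and false alarms. Below is a minimal Python sketch, assuming a threat-score-style measure (hits divided by hits plus misses plus false alarms); the day-by-day flags are invented for illustration, and the same counts also yield the false-alarm ratio cited later for tornado warnings.

        # Contingency-table verification of "heavy precipitation tomorrow" forecasts.
        # Assumption: the 0-to-1 skill number quoted above behaves like a threat score
        # (hits / (hits + misses + false alarms)); the flags below are made up.

        def contingency_counts(forecast, observed):
            hits = sum(f and o for f, o in zip(forecast, observed))
            misses = sum((not f) and o for f, o in zip(forecast, observed))
            false_alarms = sum(f and (not o) for f, o in zip(forecast, observed))
            return hits, misses, false_alarms

        def threat_score(forecast, observed):
            hits, misses, false_alarms = contingency_counts(forecast, observed)
            total = hits + misses + false_alarms
            return hits / total if total else float("nan")

        def false_alarm_ratio(forecast, observed):
            hits, _, false_alarms = contingency_counts(forecast, observed)
            warned = hits + false_alarms
            return false_alarms / warned if warned else float("nan")

        # Hypothetical week: did the forecast / the observation exceed 2.5 cm of rain?
        forecast = [True, True, False, True, False, False, True]
        observed = [True, False, False, True, True, False, False]
        print(threat_score(forecast, observed))       # 2 / (2 + 1 + 2) = 0.4
        print(false_alarm_ratio(forecast, observed))  # 2 / (2 + 2) = 0.5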

    All hail the computer

    Forecasting global weather patterns, heavy rain and snow, and any number of other sorts of weather better and further into the future has depended heavily on increasing computer power. When numerical weather prediction began in the mid-1950s, NWS's computational capacity was a meager 1 kiloflop (1000 calculations per second). It's now 10^8 megaflops (100 trillion calculations per second), an increase of a factor of 100 billion.

    Ever lower.

    Errors have long been declining in NHC's official forecasting of where a hurricane will be 1 to 5 days ahead. That has allowed forecasters to warn the public earlier to flee the destruction of the coming storm.

    CREDITS (TOP TO BOTTOM): NATIONAL HURRICANE CENTER/NOAA; ANDREA BOOHER/FEMA PHOTO

    Forecasters have plenty of uses for the added computing power. The most straightforward is sharpening a forecast model's view of the atmosphere. Models work by calculating changes in air pressure, wind, rain, and other properties at points on a globe-spanning grid. In principle, the more points there are in a given area, the more closely the model's weather will resemble the real weather.
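
    To make the grid idea concrete, here is a toy Python sketch (not any operational model) that carries a narrow weather feature across a one-dimensional periodic grid with a simple upwind scheme; the coarser grid smears the feature, which is the sense in which more grid points make the model's weather more faithful to the real thing. All numbers are invented for illustration.

        import numpy as np

        # Toy grid-point "model": one-dimensional advection (du/dt + c du/dx = 0)
        # stepped forward with an upwind finite difference on a periodic grid.
        # This is a stand-in for how forecast models update each grid point from
        # its neighbors; the spacing dx plays the role of model resolution.

        def advect_step(u, c, dx, dt):
            # upwind difference for c > 0: u[i] loses (u[i] - u[i-1]) * c*dt/dx
            return u - c * dt / dx * (u - np.roll(u, 1))

        c = 20.0                # "wind" speed, km per hour (made up)
        L = 1000.0              # domain length, km
        hours = 10.0            # forecast length
        for n in (40, 160):     # coarse grid vs. fine grid
            dx = L / n
            x = np.arange(n) * dx
            u = np.exp(-((x - 300.0) / 30.0) ** 2)   # narrow "storm" feature
            dt = 0.4 * dx / c                        # keep the step numerically stable
            for _ in range(int(round(hours / dt))):
                u = advect_step(u, c, dx, dt)
            print(f"{n:4d} grid points -> feature peak after {hours} h: {u.max():.2f}")

    Run it and the fine grid preserves the feature's peak far better than the coarse one, even though both grids use the same equations.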

    In the early days of numerical weather prediction, grid spacing—or resolution—was something like 250 kilometers, far too coarse for individual thunderstorms. Today, thanks to increased computer power, grid points in global models are 15 kilometers to 25 kilometers apart. Over areas of special interest, resolution can be even greater. In the NWS global model, a grid with 4-kilometer spacing can be laid or “nested” on the lower 48 states of the United States.

    So far, every increase in operational model resolution has produced more realistic simulations of the weather and thus more accurate forecasts, says William Lapenta, acting director of NWS's Environmental Modeling Center in College Park, Maryland. Now NWS's goal, he says, is to use model forecasts to warn the public about the most severe weather threats—which happen to be on the smallest scales—as soon as the models predict them. For that, the models will need to get down to 1-kilometer resolution.

    Beyond increasing resolution, forecasters have used added computing power to improve a forecast model's starting point. Observations from weather stations, weather balloons, and satellites must be fed into a model to give it a jumping-off point for forecasting. But a model's intake, or “assimilation,” of observations has never been optimal.

    The assimilation of weather satellite observations that began in the 1970s was especially far from ideal. Satellites measure atmospheric infrared emissions at various wavelengths, and these observations were converted into temperature, pressure, and humidity values like those returned by weather balloons. “That didn't work very well,” says meteorologist James Franklin of the NWS's National Hurricane Center (NHC) in Miami, Florida.

    Beginning in the 1990s, instead of converting satellite observations to familiar atmospheric properties, forecasters began to assimilate the satellite observations directly into models without converting them. That improved forecasts all around, but especially forecasts of where hurricanes are headed. “A lot of the success in hurricane track forecasting is using the satellite data in a more intelligent, natural way,” Franklin says.

    That's because hurricanes don't propel themselves. “To first order, a hurricane is moving like a cork in a stream,” notes meteorologist Russell Elsberry of the Naval Postgraduate School in Monterey, California. The better the forecast for the stream—the atmosphere's flow for thousands of kilometers around—the better the forecast for a hurricane's eventual track.
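
    For readers curious what the "assimilation" described above looks like in practice, here is a schematic Python sketch of the analysis update that blends a background forecast with an observation. It is not NOAA's or ECMWF's actual system; the state vector, the observation operator H (which maps model temperatures to a radiance, the sense in which radiances are assimilated "directly"), and the error covariances are all invented for illustration.

        import numpy as np

        # Schematic data-assimilation update: blend a background forecast x_b with an
        # observation y using a Kalman-gain-style weighting. Direct assimilation of a
        # satellite radiance means H maps model temperatures into radiance space rather
        # than converting the radiance to a pseudo-temperature first. All values invented.

        x_b = np.array([250.0, 265.0, 280.0])   # background temperatures (K) at 3 levels
        B = np.diag([4.0, 4.0, 4.0])            # background error covariance
        H = np.array([[0.2, 0.3, 0.5]])         # observation operator: state -> radiance
        R = np.array([[1.0]])                   # radiance observation error covariance
        y = np.array([272.5])                   # observed brightness temperature (made up)

        # Analysis: x_a = x_b + K (y - H x_b),  K = B H^T (H B H^T + R)^-1
        K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
        x_a = x_b + K @ (y - H @ x_b)
        print(np.round(x_a, 2))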

    Thanks to improving models, NHC hurricane forecasters have made great strides in track forecasting. “In the 1970s, the best forecast was what human forecasters could do,” says meteorologist Mark DeMaria, who works for NOAA (NWS's parent agency) at Colorado State University in Fort Collins. But by the 1990s, models forecasting hurricane tracks surpassed human performance.

    Today, NHC forecasters consult a half-dozen different models before predicting a hurricane's position 3 days into the future. Those forecasts now have an error of 185 kilometers. In the 1970s, the 3-day error was about 740 kilometers. That quartering of track error has enabled NHC forecasters to issue hurricane warnings 36 hours ahead instead of 24 hours ahead, giving coastal residents 50% more time to evacuate.

    When Hurricane Sandy began brewing in October, the models consulted by NHC converged on a serious threat to the U.S. Northeast several days before landfall, though ECMWF modeling gave an inkling a whopping 10 days ahead.

    Making it real

    Other, less computationally demanding improvements have also brought model calculations closer to reality. For example, every weather forecast model in routine use today has a serious problem with simulating vertical air movement. In effect, their equations impose a speed limit on vertical motions. That is a particular problem in simulating the so-called supercell thunderstorms that can generate tornadoes and the violent winds near a hurricane's eyewall.

    So researchers from the National Center for Atmospheric Research in Boulder and elsewhere spent 15 years developing the Weather Research and Forecasting Model, or WRF (pronounced “worf”). Its equations of motion allow rapid vertical acceleration of air when cold, heavy air rushes downward to become highly destructive surface winds.

    In ongoing experimental forecasts, a WRF-derived forecast model being run at ESRL, called the High-Resolution Rapid Refresh model (HRRR), is having some dramatic forecast successes, ESRL's MacDonald says. On 29 June, the 3-kilometer-resolution model made a forecast over the lower 48 states, starting with a small cluster of thunderstorms in northern Illinois. The 12-hour forecast put the much-intensified system at Washington, D.C., at about 10 p.m. that night with winds to 100 kilometers per hour. That's exactly how the storm cluster actually evolved into the “D.C. derecho,” the worst summer windstorm in the D.C. area in decades.

    Here's hoping.

    Feeding radar data into models can help forecast hurricane intensity, but monitoring all the possible tornado-generating storms in Tornado Alley would be a daunting task.

    CREDIT: HERB STEIN, CSWR

    Though MacDonald cautions that not every HRRR forecast is so successful, “for big, dangerous storms, we tend to do pretty well,” he says. Other successes of the model include an early forecast this summer that Hurricane Isaac would head for New Orleans, Louisiana, while most other models had it threatening the Republican convention in Tampa, Florida. For Sandy, 15 hours ahead of time, HRRR nailed the 130-kilometer-per-hour winds that pushed water into New York Harbor.

    Forecasting bugaboos

    Despite all the recent advances, forecasters have hit a wall when the weather plays out rapidly and on small scales. The deployment in the 1990s of NWS Doppler radars capable of mapping out the spinning winds of supercells enabled forecasters to extend average tornado warning times from 3 minutes to 13 minutes. But the new technology left the tornado false-alarm rate—how often forecasters warned of a tornado that never showed up—stuck at an uncomfortably high 75%.

    The problem, says Joshua Wurman of the Center for Severe Weather Research in Boulder, is that “we don't have a good idea of what's going to make a tornado. Seventy-five to 80% of supercells don't make a tornado. Some of the meanest-looking ones don't make them. What's special about the 20% that do? We know there must be some subtle differences between supercells, but they have eluded us.” And even the highest-resolution models are giving few clues to the secrets of tornadogenesis.

    Hurricane forecasters trying to predict storm intensity are in much the same boat. The error in forecasting a storm's maximum sustained wind speed a few days in advance has not changed much since the NHC record began in 1990. Forecasts are particularly bad when a hurricane rapidly intensifies or weakens. In 2004, “Hurricane Charley went from a Category 2 storm to Category 4 overnight, and nobody knows why,” says tropical meteorologist Peter Webster of the Georgia Institute of Technology in Atlanta. “It's a mystery what determines intensity. Perhaps we still don't understand the basic physics of a tropical cyclone.”

    Progress, or not.

    The U.S. National Weather Service has increased its tornado warning time (lead time, red), thanks to Doppler radar. But the false alarm ratio—how often forecasters warned of a tornado that never appeared—hasn't budged in 20 years.

    CREDIT: J. WURMAN ET AL., BULL. AMER. METEOR. SOC., 93 (AUGUST 2012)

    Help is on the way, however, at least for those predicting hurricanes. Because the mystery seems to lie at or near a tropical cyclone's eyewall, which is only a few kilometers thick, researchers have been figuring out how to usefully assimilate the detailed three-dimensional observations of airborne Doppler radar into hurricane forecast models. Meteorologist Fuqing Zhang of Pennsylvania State University, University Park, and colleagues have been doing just that using their own WRF-based forecasting system run at resolutions of 1 kilometer to 5 kilometers.

    Radar input has improved the accuracy of these intensity forecasts by 30% to 40% during the past 5 years, Zhang says. In the case of Sandy, ECMWF's modeling advantages—higher resolution among them—yielded suggestions of a strong storm 8 days ahead.

    Onward and upward, if …

    “The computer continues to be the big issue” in most types of weather forecasting, says James Hoke, director of NWS's Hydrometeorological Prediction Center in College Park. Researchers have long complained that they need more computing power. “It's frustrating for all of us, not being able to implement what we know,” Hoke says. But “as the computer power becomes available, the current trend of improvement will hold out for at least a decade.”

    That is, if forecasters can afford to keep adding the needed computing power. But MacDonald sees a way ahead: so-called massively parallel, fine-grain processors such as Intel's MIC (Many Integrated Core) chips or NVIDIA's graphics processing units. GPUs are the specialized electronic circuits found in game consoles and other devices that must process large blocks of data in parallel. MacDonald sees GPU-based forecast computers running at petaflop speeds at a fifth the cost of machines built from conventional central processing units.
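
    As a rough illustration of why grid-point models suit GPUs, the sketch below applies the same smoothing arithmetic to every point of a toy 2-D grid at once; that uniform, per-point work is what "processing large blocks of data in parallel" means. The use of CuPy as a drop-in GPU array library is an assumption; the code falls back to NumPy on a CPU.

        # Data-parallel grid update: every grid point gets the same arithmetic, which
        # is the pattern GPUs accelerate. CuPy (if installed) runs the identical array
        # code on a GPU; otherwise NumPy runs it on the CPU. Purely illustrative.
        try:
            import cupy as xp
        except ImportError:
            import numpy as xp

        field = xp.arange(2000 * 2000, dtype=xp.float64).reshape(2000, 2000)

        # One diffusion-like smoothing pass applied over the whole grid at once.
        field = 0.2 * (field
                       + xp.roll(field, 1, axis=0) + xp.roll(field, -1, axis=0)
                       + xp.roll(field, 1, axis=1) + xp.roll(field, -1, axis=1))
        print(float(field.sum()))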

    If an ongoing federal interagency program can accelerate the adoption of GPUs, MacDonald says, NWS could be producing “transformational” forecasts by fiscal year 2017. With more computer power, the assimilation of radar observations, and more physically realistic models, forecasters could be doing an even better job forecasting the next Sandy.

  5. Meteorology

    One Sandy Forecast a Bigger Winner Than Others

    1. Richard A. Kerr

    The way the European Centre for Medium-Range Weather Forecasts' global model handled Hurricane Sandy illustrated the advantages of having the greatest computing power available.

    A week-ahead hit.

    Multiple ECMWF track forecasts (colored lines) made 7.5 days ahead had Sandy hitting the U.S. Northeast (actual track in black).

    CREDIT: ECMWF

    As winds swept Hurricane Sandy out of the Caribbean into the Atlantic, forecasters pored over their computer outputs to gauge its threat to land. Dramatic improvements in computing power and weather observations have steadily improved those forecasts, saving countless lives (see p. 734). In fact, official forecasts from the U.S. National Hurricane Center (NHC) in Miami, Florida, for Hurricane Sandy were even better than usual.

    But among the computer forecast models that NHC forecasters consulted, the global model from the European Centre for Medium-Range Weather Forecasts (ECMWF) stood out, as it tends to do. The European edge in forecasting Sandy illustrates once again the advantages of having the greatest computing power available and a laserlike focus on a single sort of forecasting.

    The ECMWF advantage was particularly evident out in the medium range 3 to 10 days ahead. Model runs made 9 to 10 days before Sandy struck New Jersey south of New York City gave “signs … of a major storm hitting somewhere in the northeast United States,” says ECMWF's Tim Hewson. By 7 to 8 days ahead, ECMWF's model “had a reasonably good idea of what might happen,” he says. And even at 5 days out, while the ECMWF model still had Sandy making a last-minute left turn into the Northeast, most other models showed the storm bearing right and moving harmlessly out to sea.

    CREDIT: EARTH OBSERVATORY/NASA

    What the ECMWF model got right and the rest got wrong early on was an unusually complex meteorological situation. Hurricanes do not propel themselves; they drift along in the broad flows of weather systems. So forecasting where a hurricane will be in 5 days' time requires forecasting where all the highs and lows and jet streams will be.

    As Sandy drifted northward from the Caribbean, the track ahead was in dispute. Most models forecast a typical fate for Sandy: the usual eastward-flowing jet would steer the storm east and north out over the Atlantic. But the ECMWF model predicted a very different situation. It had a low-pressure system moving well south over the continent while a ridge of high pressure formed over the Atlantic. With these features pushing the jet out of the picture, Sandy would continue northward until a low-pressure trough helped draw it westward across the coast. Most of the models eventually agreed on that scenario, but not until about 4 days out.

    Pretty good.

    Six days ahead of landfall, the ECMWF model's forecast put Sandy (red, at the center of the pressure isobars) off the New Jersey coast (left), where it eventually hit (right, the actual storm).

    CREDIT: ECMWF

    Hewson is confident about ECMWF's performance: “Our model is statistically better, a leader around the world.” The reasons for that leadership—which ECMWF has held since it began forecasting in 1979—are twofold. For one, “it's a very focused organization,” Hewson says. Long-range forecasting around the world is ECMWF's only business. Its analysts don't spend energy worrying about that thunderstorm spinning off tornadoes in the next county.

    In addition, “you need a really good supercomputer,” Hewson says. Having more computer power available for long-range forecasting than anyone else gives ECMWF's model a sharper and more detailed picture of the weather, which makes for a more realistic forecast. And the ECMWF model is run more times than other models to produce a single forecast, a practice that helps reduce the uncertainties in forecasting an inherently chaotic atmosphere.
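
    Running a model many times from slightly different starting points is known as ensemble forecasting, and a toy chaotic system can illustrate it. The Python sketch below uses the classic Lorenz-63 equations as a stand-in for the atmosphere (it is not the ECMWF model); the growing spread among the perturbed runs is the uncertainty information an ensemble provides.

        import numpy as np

        # Toy ensemble forecast with the chaotic Lorenz-63 system (a stand-in for the
        # atmosphere, not the ECMWF model). Each member starts from a slightly
        # perturbed state; the spread among members grows with lead time and measures
        # how trustworthy the forecast remains.

        def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            x, y, z = state
            return state + dt * np.array([sigma * (y - x),
                                          x * (rho - z) - y,
                                          x * y - beta * z])

        rng = np.random.default_rng(0)
        members = np.array([1.0, 1.0, 1.0]) + 0.01 * rng.standard_normal((20, 3))

        for step in range(1, 1501):                  # roughly 15 time units of "weather"
            members = np.array([lorenz_step(m) for m in members])
            if step in (100, 500, 1500):
                spread = members.std(axis=0).mean()
                print(f"step {step:4d}: ensemble spread {spread:7.3f}")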

    But the ECMWF model is not always the best. In June, it had Tropical Storm Debby heading northwest into Texas. The U.S. Global Forecast System (GFS) got it right, predicting that the storm would move off to the east to hit Florida. Such fallibility is why, for U.S. forecasters, “there's no single model we're always going to go with,” says Joseph Sienkiewicz of the National Centers for Environmental Prediction in College Park, Maryland. The centers, which include the NHC, develop and operate forecast models within the National Weather Service.

    For Sandy, NHC forecasters combined many runs from many models, including the GFS and the ECMWF model, to create a consensus model forecast. Then they made an official forecast, taking into consideration the foibles of the various models. As Sienkiewicz notes, “forecasters still do make forecasts.” And the ones they made at the NHC in the several days before Sandy's landfall served well indeed. “As the time got closer, we were able to home in on a solution,” Sienkiewicz says. “It's quite a success for the science.”

  6. Science in Europe

    Poor but Smart

    1. Kai Kupferschmidt

    It may not be an economic success. But the new, reunited Berlin has become one of Europe's hottest cities for science.

    The place to be.

    Berlin has developed a reputation as a cool, edgy city.

    CREDIT: ISTOCKPHOTO.COM/ANDREA ZANCHI

    BERLIN—When Detlev Ganten first arrived here a year after Germany's reunification, Berlin was a giant construction site struggling to reconnect two halves. It was not exactly inviting. He and his wife moved to East Berlin to be close to Ganten's job at the newly founded Max Delbrück Center for Molecular Medicine (MDC), but taxi drivers didn't know how to find the little-known institute. Their neighborhood, which lacked supermarkets, restaurants, and even street lamps, made his wife cry. Although Germans, they felt like visitors in a foreign country.

    Twenty years later, the neighborhood has undergone a transformation, and MDC is not only known to local taxi drivers but has also earned a prominent place on the world's scientific map. A 2010 Thomson Reuters ranking put it at number 14 in terms of output in molecular biology and genetics—“ahead of Stanford,” as Ganten proudly proclaims.

    That number tells a story—both about the institute and the ascent of Berlin as a science city. Since the Berlin Wall came down 23 years ago this week, the German capital has failed to live up to its economic expectations, but it has blossomed as a scientific hub, especially in the life sciences and mathematics. The city is now home to three respected universities and a host of nonuniversity research institutes, including MDC, one of three Helmholtz research centers residing in Berlin and neighboring Potsdam.

    In May, the triennial Funding Atlas from the German Research Foundation (DFG), which covered the years 2008 to 2010, showed that Berlin had overtaken Munich, its chief rival, as the region receiving the most third-party funding in Germany. A month later, Humboldt University became the second university in Berlin to earn the title “elite university” in Germany's Excellence Initiative. (Munich is the only other city to have won this accolade twice.) Later this year, a merger of MDC with the Charité, one of Europe's largest university clinics, could further boost Berlin's standing as a powerhouse for translational research.

    Many institutes here now vie for top talent from around the world—and Berlin's new reputation as a hip, happening place is helping to lure them. “Poor but sexy,” a term coined by Berlin Mayor Klaus Wowereit, has become Berlin's unofficial motto—but today, it might as well be “poor but smart.”

    Cold War fault line

    Berlin has a long and glorious scientific tradition; this was, after all, the place where Robert Koch discovered the bacteria causing tuberculosis and cholera, Albert Einstein completed his theory of general relativity, and Max Planck laid the foundation of quantum theory. But National Socialism and World War II robbed the city of many of its greatest minds—and Berlin could not recover during its 44-year postwar division.

    Günter Stock, the head of the Berlin-Brandenburg Academy of Sciences and Humanities, moved from Heidelberg to West Berlin in 1983 to work for pharma company Schering as a physiologist. “Berlin was not an attractive place,” he recalls. West Berlin was a walled-in enclave, poised on a Cold War fault line, where cronyism thrived. “Professorships were routinely given to assistants of other professors,” Stock says. It was also lacking in quality of life; to lure residents, the government gave people working in West Berlin a salary bonus. “There was a real subsidization mentality,” Stock says.

    In East Berlin, meanwhile, leading scientific posts were filled according to party loyalty rather than excellence, and scientific methods and equipment were often outdated. When Karl-Max Einhäupl, head of the Charité, first came to the clinic in 1993, the scene still reminded him of the set for a postwar period movie: “You wouldn't have had to add anything. And you wouldn't have had to take anything away.”

    Almost 2 decades later, Berlin is still struggling in many ways. The many new businesses expected in the wake of reunification never showed up, and the city's 11.8% unemployment rate is twice the national average. Commuters are exasperated by faulty brakes, jammed doors, and other problems on the city's underground, and in May, the opening of the new international airport, a multibillion-dollar prestige project, had to be pushed back by a year and a half because of planning failures and mismanagement.

    The fact that science is doing so much better, most scientists agree, is based at least partly on some bold choices made after the Wall came down. While many organizations and agencies in East Germany were shut down or taken over by their Western counterparts, Berlin's science senator, Manfred Erhardt, insisted on keeping the Charité, Humboldt University, and other East German science institutions alive and independent. To be sure, they had to change: Three institutes of the East German Academy of Sciences were united to form MDC, for instance, and hundreds of university professors were suspended. But just as gene duplications allow a genome to take new evolutionary paths, the preservation of parallel structures gave Berlin science more opportunities to evolve.

    The decision to base the new federal government in Berlin added scientific clout. Many scientific organizations moved their administrative offices here and started new labs or institutes. Five new Max Planck institutes opened in the Berlin area, bringing the total to eight, and the city hosts 13 Leibniz institutes. Several new government institutes opened up as well, including the Federal Institute for Risk Assessment, founded in 2002.

    Perhaps most importantly, scientists wanted to come to Berlin. Really interesting scientists want an interesting environment, says Nikolaus Rajewsky, a German-born mathematical biologist who left his job at New York University in New York City to join MDC in 2006—and Berlin is such an environment. The city seems permanently in transition, even if no one is quite sure where that is leading. “Whenever I was in Berlin I had this feeling of a huge delta,” Rajewsky says. “The city was changing, something was happening here.”

    Today, Berlin has also developed a reputation as a cool, edgy place with world-class museums, good clubs, and a huge cultural offering. It boasts three opera houses and one of the world's best symphony orchestras, artists and designers have moved into abandoned factories and warehouses, and the city has become a fertile breeding ground for Internet start-up companies. And Berlin is quite cheap. It's still possible to rent a nice three-room flat in a desirable neighborhood here for less than €800 a month—a fraction of rents in London, Paris, or San Francisco. The creative cuisine and countless bars don't break the bank either.

    Competition and collaboration

    Back in the Heimat.

    Klaus Rajewsky and his son Nikolaus left jobs in Boston and New York, respectively, to come to Berlin.

    CREDITS: DAVID AUSSERHOFER/THE MAX DELBRÜCK CENTER FOR MOLECULAR MEDICINE

    Rajewsky's father, Klaus, is another prominent addition to Berlin's science scene—and a sign of larger shifts in Germany's academic world. In 2001, after 40 years of research at the University of Cologne, the noted immunologist left for Harvard Medical School in Boston, because at age 65 he was threatened with forced retirement. In 2010, he returned to his home country, where the rules had changed, and he now has an open-ended contract at MDC. “That this was possible only 10 years after I had to leave the country really shows you how much things have changed,” he says.

    That his son had relocated to Berlin helped nudge Klaus Rajewsky back, but he says the funding situation in the United States had also gotten worse. “Mouse genetics is expensive research, and it became more and more difficult to get grants,” he says. “In Germany on the other hand, you get institutional support that U.S. researchers can only dream of.” Molecular biologist Matthew Poy echoes that sentiment. The junior group leader, born and raised in the United States, came to MDC in 2008 after postdoc positions in New York and Zurich, Switzerland. His most important reason: The position came with guaranteed research funds. “The secret is out that Germany has a lot of money dedicated to science,” Poy says.

    Indeed, Germany's deep coffers allowed Nikolaus Rajewsky to realize his dream: setting up a brand-new center, the Berlin Institute for Medical Systems Biology, which aims to study gene regulation “all the way from chromatin to protein.” The nascent institute is part of MDC, but it will get a new €40 million building, currently under construction in the city center. Rajewsky has already hired researchers from Columbia, Duke, and Harvard universities.

    That said, Berlin still isn't Germany's Boston—“that comparison is ridiculous,” Klaus Rajewsky says. For example, it's harder to get good postdocs here, he says, and the work ethic is more relaxed. “In Boston, science is the main thing in life for most people and they work accordingly,” Rajewsky says.

    Berlin is also struggling to attract biotech and pharma companies, says chemist Peter Seeberger, who came to Berlin in 2009 after working as a professor at the Massachusetts Institute of Technology in Cambridge and ETH Zurich. “In Boston you can go to university, then cross the road and start working in a company,” Seeberger says. “Berlin needs that to play in the first league internationally.” And while nonuniversity institutes such as Max Planck can offer high salaries, Berlin's universities have missed out on some top talent because they couldn't match other universities' offers, he adds.

    Healing the past

    It's not just the life sciences that are booming. Mathematics has become another hot field. In 2002, mathematicians from the three Berlin universities and two institutes got together to form the research center Matheon, funded by DFG to do applications-driven research. It has made Berlin one of Germany's two top math cities, says Günter Ziegler, a scientist working at Matheon. (The other is Bonn, the former capital.) Further proof of that, he says, is the fact that the International Mathematical Union—the body that awards the Fields Medal—set up a permanent home here in 2011.

    More developments are afoot. Negotiations are under way to merge MDC with the university clinic Charité. That's important because by law, German universities are paid by the states; they cannot be funded directly by the federal government. Helmholtz institutes like MDC, on the other hand, get 90% of their money from the federal government. Binding the two together in an entity would allow the federal government to give extra funding for medical research at the Charité—perhaps as much as €50 million to €80 million annually, insiders say.

    To Stock, it would be the most important event in years for Berlin's rise as a science capital. And Ganten hopes that in the long run, it will allow Berlin and the Charité to recapture some of their past glory. “Berlin has such a rich tradition,” he says. “Einstein, Planck, Heisenberg. Koch, Ehrlich, Virchow. These names still mean something today.” Now that the wounds of a divided city have healed, Berlin seems set to retake its place in Europe's scientific history.
