# News this Week

Science  17 Nov 2006:
Vol. 314, Issue 5802, pp. 1060
1. ELECTION 2006

# Science Awaits Impact of Democratic Sweep in Congress

1. Jeffrey Mervis

Science policy lobbyists like to say that strengthening the U.S. research enterprise isn't a partisan issue. That theory will be put to the test starting in January—and perhaps even sooner—when the research community tries to cash in on last week's Democratic capture of both the Senate and the House of Representatives without sacrificing expected legislative gains under the current Republican leadership.

Specific areas may benefit: Calls for relaxing constraints on embryonic stem cell research and greater environmental stewardship may have helped propel some Democrats to victory and raised hopes for action in the upcoming 110th Congress (see pages 1061, 1062). But on the overall direction of government spending on science, there's less difference between the two parties than on many issues. Both, for example, support a 2005 report from the National Academies on how to improve U.S. competitiveness—including doubling the budgets of some science agencies—although they disagree on which recommendations to emphasize and how quickly to proceed. Even so, legislation to implement many of the report's suggestions has been stalled, and many lobbyists are saving their powder for the new regime.

“I don't think there's any broad message for science in the election,” says Representative Vernon Ehlers (R-MI), a 13-year veteran who had hopes of chairing the House Science Committee had the Republicans remained in power. “Science continues to be largely bipartisan.” Both Ehlers and Representative Rush Holt (D-NJ), who jokingly call themselves a two-person congressional physics caucus because of their Ph.D.s in the field, expect Democrats to push ahead next year with their own bills to improve U.S. competitiveness that contain major increases for research, education, and training, and clean-energy technologies. But if and when those authorization bills pass, it may be hard to find money to implement them.

Indeed, the stage for budget battles next year could be set in the next few weeks. That's when the lame-duck Republican Congress considers appropriations bills containing hefty spending increases for several science agencies. Science lobbyists fear that some of those bills, covering the 2007 fiscal year that began 1 October and based largely on requests from President George W. Bush, could be severely trimmed to meet another goal that both parties swear allegiance to: reducing next year's expected budget deficit of $335 billion. Although most observers are still hoping Congress will approve spending bills based on agency-by-agency negotiations, another path would be to hold every agency to 2006 funding levels under what's called a continuing resolution (CR). “A CR is the worst-case scenario,” Arden Bement, director of the National Science Foundation (NSF), told a group of advisers earlier this month. “I don't want to think bad thoughts like that.” An even bigger budget wild card is the direction of the war in Iraq.

The most obvious change next year will be a new lineup of committee chairs. In the Senate, that will mean a roster of familiar Democratic faces setting the scientific agenda, including Daniel Inouye of Hawaii at Commerce, Science, and Transportation; Massachusetts's Edward “Ted” Kennedy at Health, Education, and Labor; and New Mexico's Jeff Bingaman at Energy and Natural Resources. The likely new heads of research-rich Senate appropriations panels include Maryland's Barbara Mikulski (NSF, NASA, and the National Oceanic and Atmospheric Administration) and Iowa's Tom Harkin (the National Institutes of Health). All have seen their party's fortunes wax and wane and have a history of working closely with their Republican counterparts. (Only one major committee in either body will be headed by a woman: California Senator Barbara Boxer at Environment and Public Works.)
In the House, the Democratic majority will mean a return to power of well-known figures such as Michigan's John Dingell at the helm of the Energy and Commerce Committee and California's Henry Waxman at Government Reform. California's George Miller will lead the education and workforce panel, which could be busy reauthorizing programs for both elementary and secondary school students and for the nation's system of higher education. One relative newcomer will be Tennessee's Bart Gordon, in line to chair the House Science Committee (see page 1061). The heads of the science-relevant House spending panels won't be clear for several weeks.

2. ELECTION 2006

# Environmentalists See a Greener Congress

1. Erik Stokstad

The next Congress will shift its environmental policymaking from reverse to forward, say environmental advocates celebrating last week's election results. Two major reasons for that new direction are the defeat of a powerful House member who, critics say, was bent on weakening the Endangered Species Act (ESA), and the replacement of an influential Senate chair, who infamously called global warming a hoax, with a longtime proponent of cutting emissions of greenhouse gases. “The mood is one of excitement and anticipation,” says Melissa Carey of Environmental Defense. “We haven't had a better opportunity to do something about climate change in years.” The enthusiasm is tempered: Democrats are not united on the issue, have a slim majority, and face an Administration that adamantly opposes controls on emissions. Meanwhile, President George W. Bush last week asked the lame-duck Congress to pass an energy bill, fighting words for Democrats trying to block a House version that would open up much of the U.S. coastline to drilling.

The biggest news in the House was the defeat of Representative Richard Pombo (R-CA). As chair of the Resources Committee, Pombo last year won House passage of his major revision of ESA (Science, 7 October 2005, p. 32). The bill has since stalled in the Senate. Environmental groups contributed more than $2 million to the campaign of Jerry McNerney, a wind-power engineer, who defeated Pombo 53% to 47%, ending the attempt to rewrite the ESA.

Now environmentalists are anticipating more friendly treatment. Representative Nick Rahall (D-WV), the likely new chair, wants to reform a mining law that has led to problems with contaminated tailings, protect roadless areas in national forests, and end subsidies for offshore oil exploration. Rahall also plans to examine claims that a political appointee at the Department of the Interior distorted scientific findings to prevent the listing of endangered species.

In the Senate, California's Barbara Boxer is expected to take the helm of Environment and Public Works from Senator James Inhofe (R-OK), a bête noire of the climate change community. Her priorities include legislation similar to her home state's that would cap and eventually reduce emissions of greenhouse gases. House Speaker-designate Nancy Pelosi is like-minded; she co-sponsored a stalled bill proposed by Representative Henry Waxman (D-CA) that would cap emissions in 2010 and then reduce them to 1990 levels over the next decade.

Such a bill would likely face resistance from Representative John Dingell (D-MI), who's slated to take over the House Energy and Commerce Committee. Dingell said last week that he would “support responsible legislation” and plans to hold hearings, but he told Greenwire that Waxman's bill is “extreme.” Although some advocates complain that there's already been too much talk—239 hearings on climate change, by one count—others say that the shift in power has turned the debate from whether action is necessary to how much and when.

3. ELECTION 2006

# Gordon Steps Up to House Science Post

1. Jeffrey Mervis

Representative Bart Gordon (D-TN) is known as the fastest man in Congress for his stellar performances each year in a 5K race that pits federal officials against the members of the media who cover them. Starting in January, however, the 57-year-old lawyer expects to be leading a slower-moving pack: the House Science Committee.

Although the science committee is little known outside the research and academic communities, Gordon says that he asked to be on it as a freshman and that “it was my hope all along” to become its chair some day. As the highest-ranking Democrat on the committee since 2003, he's all but guaranteed the job in the 110th Congress.

First elected in 1984 after holding Democratic Party posts in Tennessee, Gordon has been returned 11 times, mostly by comfortable margins. He succeeded Al Gore, whose election to the Senate that year launched a national career that would take him within a hanging chad of the White House. Gordon, who still lives in his hometown of Murfreesboro, holds no such grand political ambitions, say those who have followed his career. But he still wants to make a difference. “He's a totally local politician,” says Jeff Vincent, the Washington, D.C.-based head of federal relations for Vanderbilt University in neighboring Nashville. “I think this is really an opportunity for him to play a larger role.”

As chair of the committee's space panel in the early 1990s, Gordon developed an interest in space-related issues that is likely to translate into closer scrutiny of the Bush Administration's proposed moon-Mars exploration program and its impact on space science. “I think that both are underfunded,” he says, “but I think we need to know more before we can move ahead.”

His supportive but questioning attitude toward NASA mirrors the view of the outgoing chair, retiring moderate New York Republican Sherwood “Sherry” Boehlert. In fact, the two men see eye to eye on most issues before the committee—notably, additional funding for science education at the National Science Foundation (NSF), criticism of the Administration's attempts to muzzle federal scientists on sensitive topics such as climate change, and doubling federal spending for research in the physical sciences. “I can't think of a better relationship between a chair and a ranking [minority] member than between Bart and myself,” says Boehlert.

Even so, that bipartisanship may be put to the test in the next Congress. Gordon is eager to set up an entity within the Department of Energy (DOE) modeled after the Defense Advanced Research Projects Agency. Although the idea comes from an acclaimed 2005 National Academies report on strengthening U.S. science that the Administration has embraced, President George W. Bush pointedly omitted any new DOE agency from the competitiveness plan he submitted to Congress earlier this year. Gordon's desire to give NSF a bigger role in science education may also irritate the White House, which wants the Education Department in the driver's seat. And Gordon's promise to hold hearings “to give scientists a chance to tell their side of the story” about whether the Bush Administration has undermined scientific integrity is sure to draw fire from Republican colleagues.

4. ELECTION 2006

# Stem Cell Supporters Hail Results, But Political Lessons Aren't Clear

1. Eli Kintisch

On 7 November, voters in several states backed candidates supporting expanded research with embryonic stem cells. That much is clear. But the impact of those victories on federal policy that restricts the use of stem cells is much harder to discern. And experience in at least one state suggests that injecting stem cell issues into a political campaign can backfire.

“Republican candidates aren't going to want this as an issue in 2008,” asserts Sean Tipton of the Coalition for the Advancement of Medical Research in Washington, D.C. He says the election results bolster the hopes of those seeking to overcome President George W. Bush's opposition to allowing research on embryonic stem cell lines created after August 2001. (Bush had vetoed such a bill, H.R. 810, last summer, and supporters were unable to override it.) But opponents of embryonic stem cell research take heart from the fact that they almost defeated a proposed constitutional amendment in Missouri that would bar lawmakers from outlawing the research while banning reproductive cloning; as recently as September, the proposal enjoyed a 20-point lead.

The Missouri vote has reinforced one tenet of faith among supporters: Don't make stem cell research a partisan issue. Despite her personal support for Amendment 2, Senate Democratic challenger Claire McCaskill had avoided the topic during campaign appearances out of concern about offending rural, pro-life supporters. But in the waning days of her race against incumbent Republican Jim Talent, McCaskill aired a television advertisement featuring movie star Michael J. Fox, visibly afflicted by Parkinson's disease. Fox accused Talent of voting to “criminalize the science.” The ad did not mention the amendment, but it turned out to be a disaster for the amendment's supporters. National conservative icon Rush Limbaugh complained that McCaskill was trying to “mislead voters,” and Fox News host Bill O'Reilly attacked philanthropists and cancer survivors Jim and Virginia Stowers of Kansas City for standing “to make billions” off various research institutions they have set up if the amendment passed—a charge that Stowers Institute President William Neaves called “outrageous.”

“This became the center of the culture war universe,” says Bob Deis, a political consultant to amendment backers. Internal polling showed Republican support for the amendment plummeting “eight to 10 points” in a week, says Deis. In the end, Amendment 2 passed by only 50,000 votes among 2 million cast. (It's not clear whether the ad had any effect on the Senate race itself, which concentrated on the Iraq war and health care. McCaskill won narrowly after trailing Talent for much of the campaign.)

In states awash in a stronger Democratic tide, some candidates did effectively leverage local scientific and commercial interest in the research. Incumbent Wisconsin Governor Jim Doyle, a Democrat, vetoed a bill in 2005 that would have criminalized somatic cell nuclear transfer—popularly known as research cloning—a potential method of obtaining embryonic stem cells genetically matched to a patient. After a poll showed that 69% of Wisconsin voters approved of the research, Doyle ran harder than any other U.S. candidate on the issue against an opponent—Republican Representative Mark Green—who opposed the method. In a series of press conferences and TV ads, flanked by patients and entrepreneurs, Doyle touted the proposed $375 million Institutes for Discovery and efforts to recruit stem cell experts to the state. Doyle defeated Green by 53% to 45%.

In Maryland, Democratic Representative Ben Cardin also effectively trumpeted his support for embryonic stem cells in TV ads featuring Fox. The ads claimed that Cardin's opponent, Republican Lieutenant Governor Michael Steele, shared Bush's opposition to the research. When Steele's sister proclaimed in an ad that her brother “does support stem cell research,” three stem cell scientists at Johns Hopkins University in Baltimore, Maryland, held a press conference to clarify that Steele only supported work with adult cell lines. Cardin won by 54% to 44%.

Pro-stem cell lobbyists say Bush's growing unpopularity gives some Republicans a chance to vote their consciences, with less fear of political repercussions, if a stem cell bill comes up in Congress in the next 2 years. Representative Heather Wilson (R-NM), who was narrowly reelected last week in a campaign that focused on the Iraq war, explained in a TV ad that she voted to override the veto because it “was the right thing to do.” Tipton says the new Democratic majority in both houses also gives proponents a chance to apply new tactics, including attaching stem cell provisions to hard-to-veto bills or pairing them with other legislation that appeals to pro-life lawmakers or the White House. But David Prentice of the Family Research Council in Washington, D.C., which opposes any change in current federal policy, sees no “sea change” on the issue and predicts that supporters won't find it easy to overcome another Bush veto.

And although most of the country's attention is shifting to Washington, Missouri may still bear watching. There's already talk among Missouri's pro-life community about crafting a new ballot initiative that would repeal Amendment 2.

5. ELECTION 2006

# Scientists Get Out the Word

1. Yudhijit Bhattacharjee

U.S. scientists hardly play any organized role in influencing elections. But two new groups are claiming some credit for the outcome of a few races last week and say they plan to be more active in 2008.

Scientists and Engineers for America (SEA), founded in September by Nobelist Peter Agre of Duke University in Durham, North Carolina, and others, visited a handful of college campuses to support candidates favoring embryonic stem cell research, the teaching of evolution, and policies to stem global warming. The 6500-member group, which raised $95,000, also ran a few Internet banner ads and posted information on its site (http://www.sefora.org/) to help voters see the track records of different congressional candidates on key scientific issues. Senate Democratic candidates favored by SEA won in Missouri, Maryland, and Virginia.

In Ohio, a group calling itself Help Ohio Public Education (HOPE) persuaded former U.S. representative and Akron mayor Thomas Sawyer to run in a state school board race against Deborah Owens Fink, a supporter of intelligent design. “The idea behind HOPE was in part to do what the creationists have been doing: recruiting candidates and then helping them get elected,” says physicist Lawrence Krauss of Case Western Reserve University (CWRU) in Cleveland, who organized the group. Krauss also collected signatures from nearly 90% of CWRU's science faculty in support of Sawyer and four other pro-science school board candidates. “If the enemies of science can do that, why can't scientists?” he says.

Although HOPE did not raise and spend any money, it invited Brown University biologist Kenneth Miller to give public lectures about why Ohio voters needed to keep religion out of the science classroom. Sawyer trounced Owens Fink by a two-to-one margin, and three of the other four candidates endorsed by HOPE won.

Both groups plan to continue their work. SEA hopes to establish student chapters at universities and allow members to post information about where politicians stand on science. “What this election told us is that issues of science do connect with the public,” says Susan Wood, former director of the Office of Women's Health at the U.S. Food and Drug Administration and an SEA founder. “Voters are becoming increasingly aware that competent governance requires making policies based on good science.”

6. ELECTION 2006

# Elsewhere on the Election Front

Scientifically inclined. Wisconsin Democrat Steven Kagen, who won an open House seat, is an assistant clinical professor of allergy and immunology at the Medical College of Wisconsin in Milwaukee. The physician owns four allergy clinics and also maintains a lab that has published molecular analyses of several environmental allergens.

Kansas Democrat Nancy Boyda, who defeated five-term Representative Jim Ryun, worked as a field inspector and analytical chemist for the Environmental Protection Agency and held management positions at pharmaceutical companies. She holds an undergraduate degree in chemistry and education and has taught middle-school chemistry.

Political powerhouse. Tiny Cornell College in Mount Vernon, Iowa, can lay claim to two incoming Democratic House members: political science professor David Loebsack, who toppled 15-term incumbent Jim Leach, and Chris Carney, who graduated in 1981 with degrees in environmental science and diplomatic history and now teaches political science at Pennsylvania State University, Worthington-Scranton.

Raising his voice. New York Democrat John Hall, who beat Representative Susan Kelly, studied physics at the University of Notre Dame in Indiana and Loyola College in Baltimore, Maryland, before dropping out to become a rock musician. A member of the popular band Orleans in the 1970s, Hall led efforts to fight nuclear power plants before turning to politics.

2008 is really open. For the first time since 1928, neither the incumbent president nor vice president will be running for president in 2008.

7. GLOBAL CLIMATE CHANGE

# False Alarm: Atlantic Conveyor Belt Hasn't Slowed Down After All

1. Richard A. Kerr

A closer look at the Atlantic Ocean's currents has confirmed what many oceanographers suspected all along: There's no sign that the ocean's heat-laden “conveyor” is slowing. The lag reported late last year was a mere flicker in a system prone to natural slowdowns and speedups. Furthermore, researchers are finding that even if global warming were slowing the conveyor and reducing the supply of warmth to high latitudes, it would be decades before the change would be noticeable above the noise.

The full realization of the Atlantic's capriciousness comes with the first continuous monitoring of the ocean's north-south flows. In March 2004, researchers of the Rapid Climate Change (RAPID) program moored 19 buoyant, instrument-laden cables along 26.5°N from West Africa to the Bahamas. A few months later, they steamed along the same latitude, lowering instruments periodically to take an instantaneous “snapshot” of north-south flows. While waiting for the moored array to produce long-term observations, physical oceanographer Harry Bryden and his team at the National Oceanography Centre in Southampton, U.K., compared the 2004 snapshot with four earlier instantaneous surveys dating back to 1957. They found a 30% decline in the northward flow of the conveyor (Science, 2 December 2005, p. 1403), sparking headlines warning of Europe's coming ice age.

The first year of RAPID array observations has now been analyzed, and the next European ice age looks to be a ways off. At a RAPID conference late last month in Birmingham, U.K., Bryden reported on the first continuous gauging of conveyor flow. Variations up and down within 1 year are as large as the changes seen from one snapshot to the next during the past few decades, he found. “He observed a lot of variability,” says oceanographer Martin Visbeck of the Leibniz Institute of Marine Science at the University of Kiel in Germany, who attended the meeting; so much variability that “more than 95% of the scientists at the workshop concluded that we have not seen any significant change of the Atlantic circulation to date,” wrote Visbeck in a letter to the British newspaper the Guardian.

Although the immediate threat has evaporated, a difficult challenge has taken its place. “Scientific honesty would require records for decades” in order to pick out a greenhouse-induced slowing, says physical oceanographer Carl Wunsch of the Massachusetts Institute of Technology in Cambridge. “How do you go about doing science when you need decades of record?” For their part, RAPID researchers will be asking for funding to extend array operations to a decade, says Bryden. Then some combination of government agencies would have to take on the burden of decades of watchful waiting.

8. CHINESE DRUG RESEARCH

# Novartis Invests $100 Million in Shanghai

1. Richard Stone,
2. Hao Xin

SHANGHAI—Big pharma companies and China may not love each other for the same reasons, but relationships are blossoming. Companies are enamored of the low operating costs and the large market potential in China, whereas local officials are aflutter over foreign investment and know-how. So far, however, few big companies have moved their R&D efforts to Chinese soil (Science, 29 July 2005, p. 735). Many are content with long-distance relationships, outsourcing specific steps in the drug discovery process. But Novartis, the Swiss drug giant, is making a serious commitment.

Last week, Novartis unveiled plans to build a $100 million R&D center in Shanghai, a fast-growing hub of biomedical excellence. The company intends to hire some 400 mainly local scientists to focus initially on infectious causes of cancer such as hepatitis B virus, linked to a high rate of liver cancer in China. The first of two facilities is slated to open next spring. The R&D center “will encompass all stages of drug development, from early discovery all the way to clinical trials,” says Novartis spokesperson Jeffrey Lockwood.

Pharmaceutical scientists in Shanghai welcome the venture. “It's a really good thing,” says Zhuohan Hu, president of the Research Institute for Liver Diseases, a company that is negotiating an alliance with Pfizer. Hu and others predict that it will not be easy for Novartis to assemble and train such a large scientific workforce.

But for Novartis, China is not virgin territory. It set up an office in Beijing in 1997 and has R&D alliances with WuXi PharmaTech and the Shanghai Institute of Materia Medica, among others. Novartis manufactures one product in China that it developed with Chinese partners: Coartem, an antimalaria drug derived from wormwood based on traditional Chinese medicine.

In the past, companies have often formed task-specific partnerships to reduce the risk of renegade employees running off with a hot discovery. Companies such as Merck and AstraZeneca, for example, have contracted out specific jobs to different Chinese organizations. In 2002, Novo Nordisk was the first to establish a research facility in China. It set up a small R&D shop near Beijing; Roche followed in Shanghai in 2004. Novartis, however, would have by far the biggest research investment. Lockwood downplays the risk of Novartis findings being spirited out the back door. “We see the trend improving toward more rigorous intellectual-property protection,” he says.

Novartis has tapped En Li to be research director of the center, which will be down the road from Roche in Shanghai's Zhangjiang High-Tech Park. Li, a Shanghai native, joined Novartis in 2003. He's currently a research chief at the Novartis Institutes for BioMedical Research in Cambridge, Massachusetts.

One of Li's initial challenges is to find the right mix of scientists. Although China is teeming with skilled chemists, Hu contends, “it's not that easy to find good hands-on biologists here.” Lockwood is bullish. “We believe there is a growing talent pool in China,” he says. “We also hope that the center will be a magnet for [returning Chinese scientists] as well.” And Novartis won't be hiring all 400 scientists in one go: Its first Shanghai lab, expected to open in May 2007, will employ about 160 researchers. Construction on a second facility is planned to begin next summer.

9. PHYSICS

# Electronic Nuisance Changes Its Ways

1. Robert F. Service

In modern electronics, as in James Bond movies, it's the good guys versus the bad guys. The good guys are electrons, packets of electrical charge that devices such as diodes and transistors start, stop, and steer to orchestrate a dance of 1's and 0's. The bad guys: vibrations called phonons that splay heat every which way and can ultimately wreak havoc on a computer chip. But now researchers at the University of California, Berkeley, may turn some unruly phonons into allies.

On page 1121, researchers led by physicist Alex Zettl and mechanical engineer Arunava Majumdar report the first-ever set of simple devices, akin to diodes, that steer a small excess of phonons in one direction. “It's a cool result,” says James Heath, a chemist and nanoelectronics expert at the California Institute of Technology in Pasadena. If the effect can be improved, it could lead to a novel form of computation based on phonons and to heat-steering materials that make buildings more energy-efficient, among other things.

The new work marks the latest example of the unique capability of nanostructures to display odd quantum-mechanical properties. Nanotechnologists have shown that materials with at least one dimension smaller than 100 billionths of a meter can have odd optical, electrical, and catalytic behaviors due to the way they confine electrical charges. More than 50 years ago, German-born British theoretical physicist Rudolf Peierls suggested that string-shaped one-dimensional (1D) systems could also channel heat-generating phonons in unusual ways. But researchers had never managed to demonstrate any such effect.

The Berkeley team started with tiny strawlike nanotubes, some made from carbon, others of an alloy of boron and nitrogen. In previous studies, Zettl's group and others had shown that both types of nanotubes are excellent heat conductors and that phonons move through them with equal efficiency in both directions. But Zettl's graduate student Chih-Wei Chang had been studying how phonons move through nanotubes and suspected there was an easy way to give them a push. Theoretical models suggested that a 1D system loaded with extra mass at one end would make it easier for phonons to travel from the high-mass end to the low-mass end.

To test the idea, the Berkeley researchers placed individual tubes inside a vacuum chamber and bonded the two ends to a pair of custom-designed electrodes that could serve as both heaters and heat sensors. Next, they sprayed a vaporized platinum compound, C9H16Pt, into the chamber and used a beam of electrons from a scanning electron microscope to weld molecules of the gas onto one end of their nanotubes. They then sent a power surge with a known amount of energy to the heater and tracked how much heat made it through the nanotube to the sensor. Finally, they repeated the experiment with the heater and sensor reversed.

In every case, more heat flowed toward the side of the nanotube with less mass, even though the excess C9H16Pt didn't span the two electrodes and thus couldn't carry the extra heat. Zettl suspects that standing waves called solitons that vibrate through the nanotubes could be responsible for increasing the heat-carrying efficiency in one direction, although more work needs to be done to confirm this.

For now, the effect is small. At most, only a 7% excess of phonons travels in the preferred direction. That may not be enough to create phonon-based computing devices or other applications, Heath says. But such applications “may exist if someone can figure out how to do this well,” he adds.
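To see what a 7% excess means in practice, the asymmetry can be summarized as a thermal rectification ratio. The sketch below is purely illustrative—the conductance values are hypothetical placeholders, not data from the Berkeley experiment—and simply shows how such a directional excess would be computed from forward and reverse heat-conduction measurements:

```python
# Illustrative only: hypothetical numbers, not data from the Berkeley study.
# A thermal rectifier conducts heat better in one direction; the asymmetry
# is commonly quantified as (G_forward - G_reverse) / G_reverse.

def rectification(g_forward, g_reverse):
    """Fractional excess of heat conduction in the preferred direction."""
    return (g_forward - g_reverse) / g_reverse

# Hypothetical conductances (arbitrary units) consistent with the ~7%
# excess described in the article.
g_high_to_low_mass = 1.07  # heat flowing from the mass-loaded end outward
g_low_to_high_mass = 1.00
print(f"rectification: {rectification(g_high_to_low_mass, g_low_to_high_mass):.1%}")
# prints "rectification: 7.0%"
```

A ratio of zero would mean the nanotube conducts heat equally well in both directions, as unmodified nanotubes do; a perfect "thermal diode" would have a very large ratio.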

10. PUBLIC HEALTH

# SARS and Bird Flu Veteran to Take WHO Helm

1. Gretchen Vogel

Margaret Chan is no stranger to public health emergencies. The infectious-disease expert, who was elected on 9 November to be the next director-general of the World Health Organization (WHO), is best known for her role in containing two fast-spreading outbreaks—of bird flu and SARS—as Hong Kong's director of public health from 1994 to 2003. Largely on those merits, she was awarded the top slot for communicable diseases at WHO in 2005.

But Chan says that two broader problems will be her top concerns when she takes over leadership of WHO in January. “I want to be judged by the impact we have on the health of the people of Africa and the health of women,” she told the World Health Assembly just hours after being elected.

The sudden death in May of then-Director-General Lee Jong-wook led to a hard-fought race among an unprecedented 13 nominees (Science, 15 September, p. 1554). Most, including Chan, had slick Web sites and spent the last 3 months campaigning around the world. From the start, Chan was among the predicted favorites, and in the final ballot she received 24 votes; the runner-up, Mexican Health Minister Julio Frenk, received 10. She will be the first Chinese to head a major United Nations organization, and many observers hope her election will encourage China's government to take a more active role in tackling international health issues such as HIV/AIDS and bird flu.

Scientists who have worked with Chan to try to prevent a global flu pandemic immediately praised her selection. “She is a very strong leader, and translating science into policy is one of her strong points,” says Albert Osterhaus, a virologist at Erasmus University Medical Center in Rotterdam, the Netherlands. “In crisis situations, she knows how to handle things and how to maneuver through a political minefield.”

In 1997, when the first human cases of the H5N1 avian influenza strain were detected in Hong Kong, Chan quickly responded by ordering the culling of all 1.5 million poultry in the territory, an aggressive move widely credited with preventing a broader outbreak. She received more mixed reviews for her handling of the 2003 SARS outbreak; some critics say she could have pushed harder to get information from mainland China, where the disease apparently originated.

In the past few weeks, global health officials have again accused China of withholding data—this time, on the spread of avian influenza (Science, 10 November, p. 905). Hours after her election, Chan moved to dispel fears that she might not be tough enough on her own government. As director-general, her loyalty belongs to all 193 member countries, she said at a press conference. If anything, she said, she will be uniquely placed to encourage more openness from Chinese officials.

International desire for more cooperation from China played a key role in the final vote between Chan and Frenk, several observers say. Richard Horton, editor of The Lancet, who before the election made no secret of his support for Frenk's candidacy, says the result was based on political calculations rather than personal differences between the candidates. “The vote … was as much a vote for China as it was for Margaret Chan,” he says.

Chan, 59, who was born in Hong Kong and lived there most of her life, studied medicine at the University of Western Ontario in Canada and public health at the National University of Singapore. In Hong Kong, she instituted a “diapers to grave” approach to public health, with a focus on preventive care and encouraging healthy lifestyles.

In explaining her priorities after her election, Chan said that the people of Africa “carry an enormous and disproportionate burden of ill health and premature death,” and raising their status therefore must be one of the key measures of WHO's performance. Women's health is another key indicator, she said.

She emphasized that improving women's health means addressing not only reproductive health issues but also indoor air pollution from cooking fires, multiple infectious diseases, and violence. Targeting such problems improves the health of entire families and communities, she argued.

Even so, Horton predicts, her priorities could bring her into conflict with the United States, which campaigned hard for her election behind the scenes. “She can't deal with [women's health] without contraception, abortion, and condoms. … It's going to take her into deeply political territory, and that's good. That's what we need WHO to do,” he says. “She has set out a clear agenda. It's a good agenda. Now we need to give her the benefit of the doubt.”

11. PALEOGENETICS

# The Dawn of Stone Age Genomics

1. Elizabeth Pennisi

DNA from a 38,000-year-old Neandertal is revitalizing the once-moribund field of ancient DNA, and it promises a fresh perspective on how we differ from our closest relatives

When German quarry workers chipped the first Neandertal bones out of a limestone cave in 1856, DNA analysis wasn't even a glimmer in any scientist's mind. Now, two reports, one on page 1113 and the other in the 16 November issue of Nature, describe the first successes in sequencing nuclear DNA from a Neandertal bone—a feat once considered impossible. The results from the two groups, working collaboratively but using different approaches, support the view that Neandertals are a separate branch of the hominid family tree that diverged from our own ancestors perhaps 450,000 years ago or more.

Because the extinct Neandertals are our closest relatives, comparing their DNA to ours may one day reveal the mutations that set Homo sapiens apart from all other species, as well as the timing of key evolutionary changes. But it's early days yet, and this week's papers chiefly suggest the potential of Neandertal genomics. They also fan the flames of the debate about how different Neandertals were from modern humans, and whether the two groups interbred during the thousands of years they coexisted in Eurasia (see sidebar, p. 1071). “This is great stuff,” says molecular evolutionist Alan Cooper of the University of Adelaide, Australia. “It opens the way for much more work on identifying uniquely human genetic changes.”

Coming on the heels of dramatic sequencing successes with ancient mammoth and cave bear DNA, the papers also herald a renaissance for a field that has been stymied by poor sample quality and contamination. The Neandertal studies use metagenomics, which sidesteps the onerous task of purifying ancient DNA. They also employ faster, cheaper sequencing methods, and their achievement demonstrates the feasibility of deciphering ancient genetic material. “It has people talking about new ideas, new extraction techniques, new ways to prepare samples, new ways to think about old DNA,” says Beth Shapiro, an ancient DNA specialist at the University of Oxford in the U.K.

Both teams are planning major additional projects. In July, the team led by Svante Pääbo, a paleogeneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, announced that it plans to produce a very rough draft of the entire Neandertal genome in 2 years. With that draft, he and others will be better able to tell which of the 35 million bases that differ between chimps and humans are mutations that occurred in just the past 500,000 years and therefore likely define our species. “Perhaps we can find that last little bit that made us special,” says Pääbo.

Meanwhile, the other team, led by Edward Rubin, head of the Department of Energy Joint Genome Institute in Walnut Creek, California, has support from the U.S. National Institutes of Health to gather DNA from several Neandertal fossils to study specific regions deemed key to understanding human evolution. At least one other team, led by Cooper, has its own Neandertal project and is working to gather DNA from other ancient humans as well. “A whole new world has opened up with regard to what can be done with ancient DNA,” says Thomas Gilbert, a paleogeneticist at the University of Copenhagen, Denmark.

But despite the seductive promise of new techniques, researchers warn that ancient DNA has been a fickle mistress. Over the past 20 years, successes have been followed by frustration after frustration. It's hard to find suitable DNA, and it's also quite tricky to avoid contamination with modern genetic material and to cull errors. These issues may come back to haunt Pääbo and Rubin, says genomicist Stephan Schuster of Pennsylvania State University in State College. “The divergence [between living people and Neandertals] is so small compared to the DNA damage and the sequencing error” that it's hard to be confident of any results, he says. “If we've learned anything, it is that we generally haven't perceived the full extent of the problems and complexities of ancient DNA research,” admits Cooper. “We're still very much in the learning curve.”

## Ups and downs

Ancient DNA made its first appearance in 1984, when Allan Wilson of the University of California (UC), Berkeley, was able to tease out 100 bases from a quagga, an extinct species that looked like a cross between a horse and a zebra. A year later, Pääbo succeeded in extracting genetic material from a 2400-year-old Egyptian mummy.

The world was wowed by these successes, “but there was not much future in the field or the approach,” Pääbo recalls. DNA degrades after death, as water, oxygen, and microbes attack it, and the sequencing methods of the time demanded more DNA than was readily available from ancient specimens.

The polymerase chain reaction (PCR), which uses an enzyme to make millions of copies of a particular DNA fragment, seemed to be just what the field needed, offering a way to amplify and read a tiny bit of sequence. The technique powered analyses of quagga, Tasmanian wolves, moas, and other extinct species during the 1990s.

But reliable results from more ancient specimens proved hard to come by. The reaction also amplified age-induced errors and extraneous DNA. A few spectacular failures cast doubt on the whole field: Supposedly 25-million-year-old DNA from amber-encased bees and even older DNA from dinosaurs turned out to be from living humans instead. Ancient human remains were especially problematic because of the specter of contamination: Anyone who handled bone could leave traces of their DNA upon it, and it was impossible to distinguish old from modern sequence.

Then in 1997, following new methodological guidelines, a team led by Pääbo, then at the University of Munich in Germany, and his student Matthias Krings restored the appeal of ancient DNA by decoding 379 bases of Neandertal mitochondrial DNA (mtDNA) (Science, 11 July 1997, p. 176). The bases were quite different from the equivalent modern human DNA, suggesting that Neandertals were a distinct species that split off from a common ancestor a half-million years ago and did not interbreed with modern humans. That and subsequent mtDNA and fossil studies supported the leading view that H. sapiens arose in Africa and spread around the globe, replacing other kinds of humans.

But in part because modern humans and Neandertals overlapped in Europe and west Asia for at least a few thousand years, and perhaps up to 10,000 years, some researchers had continued to argue that the two species interbred. They pointed out that 379 base pairs were too few to be conclusive. Also, because mitochondria are passed on only by the mother, nuclear DNA is needed to rule out the possibility of mixing.

## Making the dream real

But getting nuclear DNA from ancient bones was a tall order. Back in 1997, “it was just a dream,” Pääbo recalls. Because the amount of nuclear DNA in a cell is just 0.05% that of mitochondrial DNA, it's even harder to get enough nuclear DNA to sequence, particularly because often the DNA has disintegrated. Also, Neandertal bones are rare, and curators are reluctant to provide samples. But Pääbo's team devised a hierarchy of tests that required just a tiny amount of material to begin with.

First they tested a tiny, 10-milligram sample for intact proteins, as their presence suggests that DNA was preserved as well. Then they examined 150 milligrams to determine the ratio of Neandertal to modern human DNA, using existing Neandertal mtDNA as a guide. Two of the 70 samples they examined passed both tests with flying colors. So Pääbo's team sliced out a larger piece of one, a 38,000-year-old bone from Croatia, and extracted the DNA.

Meanwhile, Rubin had begun to think that the metagenomics approaches that he was pioneering to study microbial diversity would work with fossil DNA too. He suggested to Pääbo that Neandertal genomics might now be possible. After Rubin's postdoc James Noonan successfully sequenced 26,861 bases of cave bear DNA (Science, 22 July 2005, p. 597), Pääbo gave a sample of the Neandertal DNA to Noonan to work on.

The two teams embarked on parallel but independent analyses using different methods. Noonan first created a library of Neandertal DNA incorporated into live bacteria. As each bacterium replicated, it made copies of a particular fragment. The team employed a new, massively parallel technique called pyrosequencing, which uses pulses of light to read the sequence of thousands of bases at once. Sophisticated computer programs then compared the sequence fragments to available DNA databases and identified the potential Neandertal ones based on their similarity to modern human sequence. The team used several tests to rule out contamination with modern human DNA, such as checking that fragments had the correct flanking sequence and the expected amount of DNA damage for their size. In all, Rubin's team was able to extract 65,000 bases of Neandertal DNA.

Pääbo employed pyrosequencing too, but he used a different method to prepare the DNA. Schuster and Hendrik Poinar of McMaster University in Hamilton, Canada, had successfully used this technique to read an astonishing 13 million bases from a 27,000-year-old mammoth (Science, 20 January, p. 392). This procedure avoids using bacteria, which for unknown reasons sometimes fail to incorporate certain stretches of DNA and so may not provide a complete sequence. Instead, Pääbo's team coated tiny beads with Neandertal DNA fragments, one fragment per bead. Then each bead's DNA was amplified, independently, by PCR, and read using pyrosequencing.

Ed Green of Pääbo's lab and his colleagues sequenced 225,000 fragments of DNA, totaling millions of bases. But by comparing the sequences with those in existing databases, they found that “the vast majority [of the DNA]—94%—has nothing to do with the human genome,” says Pääbo, and came from sources such as soil microbes. Still, they identified a staggering 1 million bases of Neandertal DNA.

Green kept tabs on contamination in part by comparing stretches of mtDNA that showed up in the sequencing to known modern human and Neandertal mtDNA. They found little modern human mitochondrial sequence and say they are confident their Neandertal DNA is genuine.

Both teams compared the new sequences to the modern human genome and to the chimp genome and tallied the sequence differences between each pair of species. Places where the two human genomes match but the chimp's differs likely mark mutations that resulted in uniquely human changes, perhaps including our upright skeletons, bigger brains, lack of hair, and so forth. Differences between the two humans are signposts to changes that were key to their individual evolution. Eventually those changes could lead researchers to the genetic basis of H. sapiens speciation.
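The tallying logic described above can be sketched in miniature. The aligned bases below are invented purely for illustration (the real analyses align hundreds of thousands of sequenced fragments against reference genomes); the point is the three-way classification: a base shared by both human genomes but not the chimp marks a change on the human lineage, whereas a modern human–Neandertal mismatch marks a change arising after the two lineages split.

```python
# Toy illustration of the three-way genome comparison described above.
# The 12-base "alignment" is invented; real analyses work on
# reference-aligned fragments spanning thousands to millions of bases.

def classify_sites(modern, neandertal, chimp):
    """Tally aligned sites by which lineage carries the change."""
    shared_human_changes = 0   # both humans match, chimp differs
    human_lineage_diffs = 0    # modern human and Neandertal differ
    identical = 0              # all three species agree
    for m, n, c in zip(modern, neandertal, chimp):
        if m == n == c:
            identical += 1
        elif m == n:           # change shared by the human lineages
            shared_human_changes += 1
        else:                  # change after the human lineages split
            human_lineage_diffs += 1
    return shared_human_changes, human_lineage_diffs, identical

# Invented example alignment:
modern     = "ACGTACGTACGT"
neandertal = "ACGTACGAACGT"
chimp      = "ACTTACGTACGA"
print(classify_sites(modern, neandertal, chimp))
```

In the published studies, counts in the first category point to uniquely human mutations, while the second category feeds the divergence-date estimates.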

As expected, the Neandertal and human genomes proved more than 99.5% identical. Rubin's team's analysis of 65,000 bases revealed that the two humans shared 502 mutations that were different from chimp bases. And 27 bases varied between modern humans and Neandertals, indicating sites where evolution occurred after the two species diverged. Assuming that chimps and humans split 6.5 million years ago, the most recent common ancestor of the two human species lived 468,000 to 1 million years ago, most likely dating back 700,000 years, Noonan and his colleagues report.

In Green and Pääbo's much larger analysis, 10,167 bases were shared by just the modern human and Neandertal, and 434 were unique to modern humans. Taking a slightly different approach from Rubin, the Leipzig team found a more recent divergence time, about 465,000 to 569,000 years ago. This matches the mtDNA analyses, too, but doesn't quite settle the question. Not everyone agrees with the 6.5-million-year-old divergence date for humans and chimps, and a different date would change the timing of the split between modern humans and Neandertals.
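At its simplest, the dating logic in both studies is a molecular clock: if a known total of differences accumulated between humans and chimps over 6.5 million years, the fraction of differences separating the two human lineages scales to a split date. The sketch below is a deliberately naive version using the article's counts (27 modern human–Neandertal differences against roughly 529 total human–chimp differences in Rubin's 65,000 bases); it will not reproduce the published figures, because the actual analyses also model variation carried over from the ancestral population.

```python
# Naive molecular-clock estimate, for illustration only. The published
# estimates (468,000 to 1 million years; 465,000 to 569,000 years)
# additionally account for ancestral population variation, which this
# simple proportion ignores.

def clock_estimate(diffs_between_humans, diffs_human_chimp,
                   chimp_split_years=6_500_000):
    """Scale the assumed human-chimp split time by the relative
    number of differences separating the two human lineages."""
    fraction = diffs_between_humans / diffs_human_chimp
    return chimp_split_years * fraction

# Counts drawn from the Rubin team's 65,000-base comparison:
years = clock_estimate(27, 502 + 27)
print(f"naive split estimate: {years:,.0f} years ago")
```

The gap between this naive number and the published ones is exactly why a different assumed chimp–human divergence date, as the article notes, would shift the Neandertal split accordingly.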

As to the question of admixture, Rubin's group found no sign of it. There were no sites where the Neandertal possessed a rare single nucleotide polymorphism (SNP) found only in Europeans, which one would expect had interbreeding occurred. However, given the size of the study, there's still a chance that such shared SNPs exist but haven't yet been found, Rubin explains. So his study refutes the notion that Neandertals were major contributors to the modern human genome but can't rule out a modest amount of gene flow.

In contrast, the Leipzig group did find some evidence of hanky-panky between the two humans—although it's far from conclusive. They used the HapMap and another large catalog of modern human variation developed by a private company to guide them to potential SNP sites in the Neandertal. They found that at 30% of those sites, the Neandertal had the same base as living people, but the chimp had a different base. That's too much similarity, given how long ago the two lineages split. “Taken at face value, our data can be explained by gene flow from modern humans into Neandertals,” says Pääbo. He thinks there may have been one-sided mating: Modern human males invaded the Neandertal gene pool by sometimes fathering children with Neandertal females, but not necessarily vice versa.

To those who have long argued for Neandertal admixture—and been in the minority—this is vindication. “These comprise some of the strongest genetic evidence of interbreeding with Neandertals that we have yet seen,” says Milford Wolpoff, a biological anthropologist at the University of Michigan, Ann Arbor. But Stanford paleoanthropologist Richard Klein disagrees. “I don't think either paper bears much on the issue of admixture,” he says. Schuster is even more circumspect: “Both papers are overinterpreting the data.”

Rubin hopes that other researchers will do their own analyses on these publicly available data to help clarify the results. But Montgomery Slatkin, a theoretical population geneticist at UC Berkeley, thinks that even with more studies and more sequence, “it will be very difficult to distinguish between a low level of admixture and no admixture at all.”

## Concern about contamination

Anxiety about the sequence being wrong fuels this pessimism. Researchers need to be sure that what they called “Neandertal” isn't really “technician” DNA. And contamination is hard to avoid. “Bone acts like a sponge; a drop of sweat on the surface will penetrate very deep,” Schuster explains.

With nonhuman ancient DNA, researchers can easily pick out and discard modern sequences, but that's not possible with Neandertal DNA, which is nearly identical to our own, notes paleogeneticist Carles Lalueza-Fox of the University of Barcelona, Spain. He is not convinced that the tests for contamination are foolproof. “It might never be possible to determine if the amplified sequence is real or one of the many potential sources of contamination,” agrees Shapiro.

All the same, researchers are making some headway. Lalueza-Fox sequenced mtDNA from everyone who had ever touched a Neandertal specimen and compared it to the DNA obtained from the Neandertal. He found that most of the contamination came from the field, not the lab. His solution: Treat the excavation site like a crime scene. Archaeologists in his team now wear face masks, coveralls, and sterile gloves; they use sterile blades and quickly freeze bones destined for DNA sampling. The dress code has reduced human contamination from about 95% to 5%, says Lalueza-Fox.

Even if contamination can be contained, ancient DNA studies must contend with errors. Sequencing itself makes mistakes. And that's where Rubin's bacterial libraries come in handy. With an ever-reproducing source of DNA, his team can sequence the same fragment multiple times and therefore tell right from wrong bases. With Pääbo's method, the sample gets used up.

More problematic are those errors that have arisen from age-related decay. “Many, and perhaps most, observed differences between a Neandertal genome sequence and the human reference will be caused by [ancient] chemical damage to the Neandertal sample,” says Webb Miller, a computer scientist at Pennsylvania State University. One way to detect such errors is to sequence and compare several different specimens, because each fossil should have a unique pattern of DNA damage, says Miller.

Here, too, Rubin's methods can help. He envisions several libraries, each from a different Neandertal. Researchers would pull out the same fragment from each library to compare with each other and with living people. A pilot project has already demonstrated probes that ferret out specific target sequences, so the team needn't analyze the billions of bases shared by Neandertals and living humans, or among different Neandertals. “We will be able to identify and confirm sequence changes in more than one Neandertal without having to sequence several Neandertals to completion,” Rubin says. “Seeing the same change in multiple Neandertals will give us confidence that we got [the sequence] right.”

Such talk of multiple sequencing has some fossil guardians anxious. “If everybody that wanted a chunk of Neandertal got a chunk of Neandertal, that would put the whole Neandertal fossil record at risk,” warns paleoanthropologist Tim White of UC Berkeley.

At this point, however, even the paleontologists seem eager to see what genomic studies can do. This month, Lalueza-Fox will bring one of his “clean-room excavated” bones to Pääbo to see whether its DNA qualifies for sequencing, and he's thrilled with the potential of sequencing. “For the [150th] Neandertal anniversary, we are moving from paleogenetics to paleogenomics,” Lalueza-Fox explains. “It is incredible considering this was impossible just a few years ago.”

12. PALEOGENETICS

# A Neandertal Legacy?

1. Michael Balter

The perennial question about Neandertal-human relations is, “Did they mate?” (Science, 11 February 2005, p. 841). The lack of a strong Neandertal signature in the modern human genome means that such interspecies dalliances were probably rare, but the Neandertal nuclear DNA sequenced to date raises the possibility that interbreeding did happen (see main text). If so, there may be traces of Neandertal genes in living people, especially if the Neandertal variants were favored by natural selection. Now a handful of other studies are finding genes that may fit the bill.

“There is now a relatively long list of candidates” for such adaptive genetic variants, contends anthropologist John Hawks of the University of Wisconsin, Madison. But not all researchers agree. Population geneticist Laurent Excoffier of the University of Bern in Switzerland counters that it's “highly unlikely” there were enough matings between Neandertals and modern humans to have left significant traces in the modern genome.

The most recent candidate was reported last week in the Proceedings of the National Academy of Sciences by a team led by geneticist Bruce Lahn of the University of Chicago in Illinois. Lahn's team had earlier claimed that a variant of the brain-related gene microcephalin first appeared in modern humans about 37,000 years ago and quickly spread around the world because it was favored by selection (Science, 9 September 2005, p. 1662). In the new work, Lahn estimated that the variant actually arose in hominids more than 1 million years ago, long before it appeared in our own lineage. He suggests interbreeding, probably with Neandertals, as a likely explanation. “It seems to be the most compelling case to date for a genetic contribution of Neandertals to modern humans,” says Svante Pääbo of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.

Similar candidates include a variant of a gene called MAPT, implicated in neurological disease, that has been shown to confer a reproductive advantage in living Icelanders. As with microcephalin, the MAPT variant appeared in modern humans about 30,000 years ago but apparently arose in hominids much earlier and so may have come from Neandertals, according to recent work by John Hardy of the National Institute on Aging in Bethesda, Maryland.

There are several genetic variants whose roots go back as far as 2 million years ago but appeared more recently in modern humans, says geneticist Michael Hammer of the University of Arizona in Tucson. He says this pattern is best explained by occasional matings among different hominid groups within Africa as well as between African migrants and Eurasian hominids, including possibly Homo erectus. Even Chris Stringer of the Natural History Museum in London, who has argued that modern humans migrating from Africa replaced Neandertals with little or no interbreeding, now says that some interspecies matings are “feasible.”

Just why genes from Neandertals or other ancient hominids would have benefited modern humans remains a mystery. But if the geneticists are correct, it could mean that before Neandertals went extinct about 30,000 years ago, they left modern humanity with lasting gifts.

13. HIGH-TEMPERATURE SUPERCONDUCTIVITY TURNS 20

# High Tc: The Mystery That Defies Solution

After 2 decades of monumental effort, physicists still cannot explain high-temperature superconductivity. But they may have identified the puzzles they have yet to solve

Twenty years ago, a firestorm of discovery swept through the world of physics. German experimenter J. Georg Bednorz and his Swiss colleague Karl Alexander Müller kindled the flames in September 1986 when they reported that an odd ceramic called lanthanum barium copper oxide carried electricity without any resistance at a temperature of 35 kelvin—12 degrees above the previous record for a superconductor. The blaze ran wild a few months later when Paul Chu of the University of Houston, Texas, and colleagues synthesized yttrium barium copper oxide, a compound that lost resistance at an unthinkable 93 K—conveniently warmer than liquefied air.

A frenzy of slapdash experimenting and sensational claims ensued, says Neil Ashcroft, a theorist at Cornell University. He organized a session on the new high-temperature superconductors at the meeting of the American Physical Society in New York City the following March. The “Woodstock of physics” stretched until 4 a.m. and bubbled over with giddy enthusiasm. “We had prominent people saying it would all be explained quickly and that we would have superconducting power lines and levitating trains,” Ashcroft says.

Ashcroft himself had doubts, however, as he told a class of graduate students a few months later. (I was a member of the class.) The materials comprised four or five elements and possessed elaborate layer-cake structures. They broke the rules about what should make a good superconductor. In short, Ashcroft predicted, high-temperature superconductivity would remain the outstanding problem in condensed matter physics for the next 25 years.

That prognostication is coming true. Two decades after high-temperature superconductors were discovered, physicists still do not agree on how electrons within them pair to glide through the materials effortlessly at temperatures as high as 138 K. Researchers haven't failed for lack of trying. According to some estimates, they have published more than 100,000 papers on the materials. Several theorists claim they have deciphered them—although their explanations clash. Still, high-temperature superconductivity has refused to submit to some of the world's best minds.

“The theoretical problem is so hard that there isn't an obvious criterion for right,” says Steven Kivelson, a theorist at Stanford University in Palo Alto, California. Experimenters are producing a flood of highly detailed data, but physicists struggle to piece the results together, says Joseph Orenstein, an experimenter at the University of California, Berkeley, and Lawrence Berkeley National Laboratory. “It must be close to unique to have so much information and so little consensus on what the questions should be,” Orenstein says.

The problem is more than a sliver under the nail. High-temperature superconductivity has shown that physicists' conceptual tools can't handle materials in which electrons shove one another so intensely that it's impossible to disentangle the motion of one from that of the others. Such “strongly correlated” electrons pop up in nanodevices and novel magnets, organic conductors and other exotic superconductors. “High-temperature superconductivity is the stumbling block of the whole discipline of condensed matter physics,” says Peter Abbamonte, an experimenter at the University of Illinois, Urbana-Champaign.

In spite of the difficulty of the puzzle, many physicists say they are closing in on a solution. Most now agree on certain key properties of the materials. Precision experiments are revealing surprising details of the compounds. And computer simulations—and perhaps even mockups fashioned of ultracold atoms and laser light—could soon show physicists whether their basic model of the problem is correct. “If I had to make a prediction,” Kivelson says, “I would say that in 10 years time the problem will be solved.”

## The ultimate chess game

Even “conventional” superconductivity, which was discovered in 1911, is mind-bending. Electrons in a metal move in quantum waves of distinct energies. Quantum mechanics prohibits two electrons from occupying the same wave or “state,” so they stack into the states from the lowest energy on up. But when metals such as lead and niobium are cooled to near absolute zero, the electrons in them can lower their total energy by pairing like ballroom dancers. That partnership produces superconductivity, as explained in 1957 by theorists John Bardeen, Leon Cooper, and John Robert Schrieffer.

The pairing alters the spacing of the rungs on the energy ladder, creating a gap near the top of the stack. To break from its partner, an electron must jump the gap to an empty state. There isn't enough energy around to allow that, so the pairs glide along unperturbed. Something must glue the pairs together, and according to the Bardeen-Cooper-Schrieffer (BCS) theory, the adhesive is quantized vibrations of the crystalline material, or “phonons.” A passing electron attracts the slower-moving ions in the crystal lattice, which squeeze together to produce a knot of positive charge that attracts another electron (see diagram).
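For readers who want the numbers behind that picture, the standard weak-coupling BCS results (textbook formulas, not specific to these articles) tie the transition temperature to the phonon energy scale and relate the zero-temperature gap to that temperature:

```latex
% Weak-coupling BCS theory: transition temperature from the Debye
% (phonon) energy \hbar\omega_D, the density of electronic states at
% the Fermi level N(0), and the electron-phonon coupling V:
k_B T_c \approx 1.13\,\hbar\omega_D\, e^{-1/N(0)V}
% Zero-temperature energy gap, the "spacing" the pairs must jump:
\Delta(0) \approx 1.76\, k_B T_c
```

The exponential sensitivity to the coupling is why conventional critical temperatures stay low, and why the 93 K and 138 K materials described here demanded a different explanation.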

High-temperature materials literally take superconductivity to a new plane. The compounds contain planes of copper and oxygen ions that resemble chess boards, with a copper ion at every corner of a square and an oxygen ion along each side. Electrons hop from copper ion to copper ion. Between the planes lie elements such as lanthanum, strontium, yttrium, bismuth, and thallium. But it is along the copper-and-oxygen planes that the electrons pair and glide.

Just how that happens is anything but clear. The electrons in an ordinary metal hardly notice one another and interact mainly with phonons. In contrast, the electrons in high-temperature superconductors shove one another so mightily that they tend to jam up with one electron on each copper ion, like gridlocked commuters. That impasse can be broken only by tweaking the material's chemical composition to siphon away some of the electrons to create positively charged “holes,” a process called doping.

The challenge then is to explain how electrons that fiercely repel each other manage to pair anyway. Some researchers argue that waves of magnetism play a similar role to the one phonons play in conventional superconductors. Others focus solely on how the electrons shuffle past one another in a quantum-mechanical game of chess. Still others say that patterns of charge or current, or even phonons, play a crucial role. Pairing might even require all of these things in combination, which would be many physicists' nightmare scenario.

## Familiar solutions

Some of the theories being refined today emerged soon after Bednorz and Müller's discovery, and the dividing lines that run through the field were drawn in those heady days. For example, as early as 1987 some theorists argued that high-temperature superconductivity arose not from phonons but from the interaction of the electrons alone. But even those who agree on that principle often disagree on the details.

The idea that waves of magnetism drive the superconductivity is based on the fact that electrons act like little magnets. Those on adjacent copper ions point in opposite directions, creating an up-down-up-down pattern known as antiferromagnetism. The electrons can tilt and flip, and waves of wobble coursing through this arrangement can provide the glue for pairing, says David Pines, a theorist at Los Alamos National Laboratory in New Mexico and the University of California, Davis.

But Philip Anderson, a theorist at Princeton University, says that no glue is necessary. Just months after the discovery of high-temperature superconductors, he proposed a scheme known as the resonating valence bond (RVB) theory, which focuses on subtle quantum connections between electrons on neighboring copper ions. In the theory, no waves of any kind pass between electrons, Anderson says.

Thanks to the weird rules of quantum mechanics, each electron can point both up and down simultaneously. Moreover, neighboring electrons can join in an odd quantum state called a singlet in which both electrons point both up and down at once, but the two electrons always point in opposite directions—either down-up or up-down. When enough holes open in the plane, singlets form and begin to slide freely past one another, eventually producing superconductivity.

Others contend that both the magnetic fluctuation and RVB theories leave out some essential piece of physics. Stanford's Kivelson believes stripes of electric charge on the planes, which have been seen in some materials, may be necessary to trigger the pairing. Chandra Varma, a theorist at the University of California, Riverside, proposes that loops of current flowing inside each copper-and-oxygen square are key.

Still others argue that high-temperature superconductivity may not have one root cause. “There is no silver bullet that is going to explain everything,” says Thomas Devereaux, a theorist at the University of Waterloo in Canada. The fact that materials with very similar structures have very different critical temperatures shows that the copper-and-oxygen planes are not the whole story, he says. Phonons may still play an essential role, such as driving up the critical temperature, Devereaux says.

As in the beginning, the field is contentious. Recent experiments have hinted at current loops. “If these are accepted, the theoretical game is over,” Varma says. “That's why no one wants to accept it.” Anderson is equally convinced that his RVB theory is correct—and underappreciated. “Eighty percent of the field is against anything—especially anything that might solve the problem,” he says.

## Mapping out the mysteries

In spite of the discord, researchers have made progress—especially the experimenters. For example, in 1994, John Kirtley and Chang Tsuei of IBM's T. J. Watson Research Center in Yorktown Heights, New York, probed the shape of the cloudlike quantum wave that describes the paired electrons. In a conventional superconductor, electrons can pair in any direction and can sit on top of each other, so the wave is a sphere. In high-temperature superconductors, Kirtley and Tsuei found, the cloud is shaped like a four-leaf clover. That “d-wave” shape means that paired electrons sit on adjacent copper ions and never on the same ion.

D-wave pairing would be hard to explain with phonons, but it had been predicted by Anderson and others who favored purely electronic theories. As a result, even most of those who say phonons play a role do not believe that they alone cause pairing.
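In symbols, the four-leaf-clover shape corresponds to a superconducting gap that varies around the Fermi surface; in the standard textbook notation (our gloss, not the article's), the d-wave gap is written

```latex
\Delta(\mathbf{k}) = \Delta_0 \left( \cos k_x a - \cos k_y a \right)
```

where $a$ is the lattice spacing of the copper-oxygen plane. The gap vanishes along the diagonals $k_x = \pm k_y$ (the pinch points of the clover) and changes sign between adjacent lobes, a pattern that simple phonon-mediated, spherically symmetric ("s-wave") pairing does not naturally produce.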

By dint of a variety of experiments, researchers have also agreed upon the properties common to all the materials, which change with the amount of doping. Cook up an undoped material, and it's an antiferromagnetic insulator. Dope it to draw between 6% and 22% of the electrons out of the planes, and it's a superconductor. Dope it more, and it becomes an ordinary metal.

These properties can be plotted on a “phase diagram” that, like some medieval map, charts the mysteries physicists face (see figure, above). “To solve the whole problem, you're going to need to understand the whole phase diagram,” says Séamus Davis, an experimenter at Cornell University. “It could be that focusing on the mechanism is the reason that the mechanism hasn't been found.”

Most intriguingly, at low doping a gap opens in the ladder of electron energy levels even at temperatures far above the superconducting transition. That “pseudogap” suggests that electrons pair at such toasty temperatures, and that superconductivity arises when the “preformed” pairs gather into a single quantum wave, some researchers say. “Everything we have seen goes in that direction,” says Øystein Fischer, an experimenter at the University of Geneva, Switzerland. And Tonica Valla of Brookhaven National Laboratory in Upton, New York, and colleagues present data online in Science this week (www.sciencemag.org/cgi/content/abstract/1134742) consistent with this interpretation.

Preformed pairs are too much to swallow for other researchers, who say the pseudogap is a sign of something else that clashes with superconductivity. For example, Zhi-Xun Shen of Stanford University and colleagues argue online in Science this week (www.sciencemag.org/cgi/content/abstract/1133411) that there may be two different gaps. Either way, the strange state might hold the key to explaining high-temperature superconductivity, says Michael Norman, a theorist at Argonne National Laboratory in Illinois. “The thing that explains the pseudogap may explain the superconductivity as well,” he says.

## Computers and cold atoms

Ultimately, the mystery of high-temperature superconductivity may be solved not in the lab or at the theorist's chalkboard but in the heart of a computer. Some theorists have turned to numerical simulations of the electrons hopping around the copper planes. If everything springs from the interactions between the electrons alone, then all the different phases and perhaps even the pairing mechanism should emerge from such simulations, much as the double helix, genes, and the mechanism of transcription arise from chemical interactions between the building blocks of DNA.

The mathematics can vary, but theorists generally study a scheme known as the Hubbard model, in which the only adjustable parameters are the ease with which the electrons hop and the strength with which they repel each other. Tracking electrons on a grid might sound easy, but the complexity of the quantum-mechanical calculations limits researchers to grids of a few dozen lattice sites. And still they must use approximation schemes to keep the calculation manageable.
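In its simplest single-band form (a standard expression in the field, not taken from the article), the Hubbard Hamiltonian contains exactly those two knobs:

```latex
H = -t \sum_{\langle i,j \rangle,\, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \text{h.c.} \right) + U \sum_{i} n_{i\uparrow} n_{i\downarrow}
```

Here $t$ sets how easily an electron hops between neighboring lattice sites $\langle i,j \rangle$, and $U$ is the energy penalty for two electrons occupying the same site. The cuprate regime corresponds to $U \gg t$, which is precisely what makes the quantum-mechanical bookkeeping so expensive on a computer.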

Such simulations have begun to reproduce pairing, stripes, and features of the pseudogap, says André-Marie Tremblay, a theorist at the University of Sherbrooke in Canada. Unfortunately, different approximation methods can lead to different results for the same parameters, says Douglas Scalapino of the University of California, Santa Barbara. But that's not necessarily a bad thing, he says, because that very sensitivity suggests that the Hubbard model can produce a variety of effects with only a little tweaking, just as high-temperature superconductors do. “I interpret that to mean we have the right model,” he says.

Meanwhile, a wild new kind of simulation could be gearing up to leave computer simulations in the dust. Physicists have begun to construct artificial crystals by suspending ultracold atoms in corrugated patterns of laser light. In such an “optical lattice,” the spots of light play the role of the ions, and the atoms play the role of the hopping electrons. The setup might be used to create an incarnation of the Hubbard model with hundreds of lattice sites and parameters that researchers can tune just by adjusting the spacing and the brightness of the spots.

Several groups are already racing to produce such systems. “In very quick succession, we have jumped over the first few hurdles, and maybe the number of hurdles ahead of us is not that much bigger than the number behind us,” says Wolfgang Ketterle, an experimenter at the Massachusetts Institute of Technology in Cambridge. Using optical lattices, experimenters could map out the phase diagram of the Hubbard model within a few years, says Henk Stoof, a theorist at Utrecht University in the Netherlands. “They have all the things they need to do it,” he says.

But even if such simulations do produce superconductivity, they may not yield a conceptual understanding of the pairing, some researchers say. Others question the relevance of the simulations to high-temperature superconductors. “We don't know that the Hubbard model is what's going on in the [materials],” says Cornell's Davis. “That's a hypothesis.”

## A threshold

Even without a theory to explain the materials, physicists agree that the pursuit of high-temperature superconductivity has already paid off handsomely. “It has led to the discovery of new materials, of new states of matter, of new concepts,” says Aharon Kapitulnik, an experimenter at Stanford University. Shen says that in their quest to unravel the phenomenon, experimenters have honed their techniques to new levels of sensitivity, precision, and speed. “High-temperature superconductivity has completely changed the landscape of experimental condensed matter physics,” he says.

At the same time, condensed matter researchers have come to see high-temperature superconductivity as the gateway to a new area of study: strongly correlated electrons. “This problem of strongly correlated electrons is the new frontier,” says Argonne's Norman, “and high-temperature superconductors have brought it to the fore.” Two decades after their discovery, high-temperature superconductors are viewed less as a singular mystery and more as a threshold to new realms of physics. Physicists hope it won't take another 20 years to cross it.

14. HIGH-TEMPERATURE SUPERCONDUCTIVITY TURNS 20

# The Next Big Hurdle: Economics

1. Robert F. Service

Researchers have solved most of the technical challenges. Now companies are struggling to make HTS devices competitive

Six months after J. Georg Bednorz and Karl Alexander Müller discovered that a family of ceramics could conduct electricity without the electrical equivalent of friction, the scientific buzz swelled into full-scale hype. News accounts gushed at the prospect of magnetically levitated trains, novel sensors, superfast superconducting computers, and of course, lossless electricity transmission cables. For a generation that grew up watching the technological utopia of the Jetsons, the future, it seemed, was just around the corner.

The trouble is, it's still there. Two decades into the revolution, the effort to commercialize high-temperature superconductors (HTS) is not for the faint-hearted. Successful applications exist, although with names and roles that few people would recognize, such as current leads and cellular base station filters. And although those and other niche applications are turning a profit for their owners, the field is nothing like its founders envisioned. “In my opinion, we oversold high-temperature superconductivity,” says Lucio Rossi, who heads the magnets and superconductors group at CERN, the European particle physics laboratory near Geneva, Switzerland.

Today's outlook is decidedly less rosy. “It's very difficult to make money on HTS,” says John Rowell, a physicist at Arizona State University in Tempe, who notes that no venture capital-funded HTS company in the United States has ever had a year of profitability. Still, hope springs eternal, and after 20 years of development, HTS equipment makers seem to be finding ground beneath their feet. “It's a slow process,” says Al Zeller, a superconducting magnet expert at Michigan State University in East Lansing. “But the applications are taking off.” “The materials science in HTS has been terrific,” says Bruce Strauss, who helps run the superconductivity program at the U.S. Department of Energy (DOE). “The engineering is just beginning. I've been seeing a lot more engineering than before of motors, coils, and so on. That's a good sign.”

## Slowing to a crawl

Part of what made the HTS revolution so exciting was that the novel superconductors looked and acted so differently from conventional low-temperature superconductors (LTS). The earlier materials were ductile metals, such as the alloy niobium-tin, that could be forged into wires for power cables or wound into spools for use in magnets, a key component for motors and generators.

HTS materials, by contrast, are brittle ceramics. In the early discoveries of HTS materials, researchers placed electrodes on opposite sides of a millimeter-sized ceramic fleck or a film of the material a few centimeters long. That setup worked to show the drop in resistance characteristic of the onset of superconductivity. But nobody knew how to turn these hard, brittle flecks into kilometers of wire.

Part of the problem is that electrons passing through HTS materials, unlike those in conventional superconductors, prefer to travel in particular directions through the material's atomic lattice. Separate grains of the material must line up so that electrons can hop from one to the next; if the alignment between the atomic lattices of separate grains is off by more than several degrees, the conductivity of the material drops dramatically.

Researchers found that creating this alignment was fairly easy in a superconductor made from bismuth strontium calcium copper oxide (BSCCO). In BSCCO, the grains resemble flat rectangular plates. By 1988, researchers learned that by packing BSCCO in a silver tube and stretching the silver into a long thin wire, they could align the BSCCO grains closely enough to carry moderate supercurrents through the material.

Today, BSCCO wire technology has advanced considerably beyond its early days. Several superconducting wire companies now turn out kilometers of such first-generation (1G) wire. That has led to a slew of prototype devices, such as power cables, high-efficiency industrial motors, lightweight ship propulsion systems, and electricity-storing flywheels.

BSCCO has key shortcomings, however. Its silver sheathing makes it expensive, and strong magnetic fields sap its ability to carry a supercurrent unless the temperature around it drops much closer to absolute zero. But lowering the temperature limits its advantage over LTS materials.

In some cases, marrying 1G wires with more conventional materials can solve the magnetic-field problem. This year, for example, researchers at Sumitomo Electric in Japan unveiled a prototype motor for ship propulsion that relies on 1G wire wrapped around an iron core, which helps the wire withstand the high magnetic fields generated. The motor, a mere 10% of the volume and 20% of the weight of a conventional motor, also saves considerable fuel. A separate 36.5-megawatt HTS ship motor, produced by American Superconductor Corp. in Westborough, Massachusetts, has passed factory tests and is expected to be delivered to the U.S. Navy later this year.

Researchers around the globe have been making steady progress on another track as well. In 1995, physicists at Los Alamos National Laboratory in New Mexico and Oak Ridge National Laboratory in Tennessee turned a different cuprate superconductor—YBCO—into a 2.5-centimeter wire that could carry 1 million amps per square centimeter of cross section. Thanks to YBCO's cheaper starting materials and better ability to withstand high magnetic fields, the novel second-generation (2G) wire held out new hope for making cheaper magnetic field-resistant wires.

But scaling up those short wires turned out to be another major challenge for the field. YBCO's various-shaped grains proved much harder to orient than the platelike grains in BSCCO. Coaxing them into alignment required years of development work. Today, researchers and companies use a variety of techniques to coat a nickel wire with ultrathin layers of materials that orient the YBCO grains as they grow on top.

## Ripe for a change?

That strategy seems to be paying off. The August Applied Superconductivity Conference in Seattle, Washington, showcased several companies that are moving YBCO into applications. Xuming Xiong, a materials scientist with SuperPower, an HTS wire maker based in New York, reported that his company now routinely produces 700- to 800-meter-long 2G wires and is building a pilot plant to make wires that exceed 1 kilometer. Companies are already using these long 2G wires to make solenoid coils for superconducting magnets. Researchers at American Superconductor, for example, reported that they had made a solenoid coil that generates a 1.5-tesla magnetic field from 420 meters of superconducting coil. Researchers at Fujikura in Japan announced a similar 1-tesla magnet.

With 1G wires now well into their commercialization effort and 2G wires working their way into prototype products, widespread applications of HTS now seem possible for the first time. The timing is fortunate, says James Daley, a superconductivity program manager at DOE, because the U.S. electricity grid is aging and due to be replaced, while demand for electricity continues to rise at about 2.6% per year. In many cases, utility companies can meet that rising demand by stringing extra power lines. But in many urban centers, gaining new rights of way to string power lines isn't possible. That has opened a window for superconducting cables, which can carry up to five times more power than conventional power lines can. DOE estimates that some 3500 kilometers of underground power cables in the United States could be converted to HTS cables.
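As a rough back-of-the-envelope check (the specific calculation is ours, not the article's), a steady 2.6% annual rise in electricity demand compounds quickly, and a cable with five times the capacity of a conventional line buys a congested corridor decades of headroom:

```python
import math

# Illustrative arithmetic based on the figures quoted in the text:
# demand growing at about 2.6% per year, HTS cables carrying up to
# five times the power of conventional lines.
growth = 0.026

# Years for demand to double under compound growth: (1 + g)^t = 2
doubling_years = math.log(2) / math.log(1 + growth)

# Years until a fivefold capacity upgrade is consumed at the same rate:
# (1 + g)^t = 5
headroom_years = math.log(5) / math.log(1 + growth)

print(round(doubling_years, 1))  # ~27 years
print(round(headroom_years, 1))  # ~63 years
```

At that growth rate, demand doubles in about 27 years, so a five-times-capacity HTS cable could in principle serve a fully loaded corridor for over six decades, which helps explain the utilities' interest despite the up-front cost.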

Three demonstration cable projects are under way in the United States. On 20 July, for example, a collaboration of HTS companies, utility-grid operators, and a cooling specialty company connected the first HTS power cable to the U.S. grid in Albany, New York. That 350-meter cable is expected to supplement the current grid, carrying enough extra power to supply more than 70,000 homes. Later this year, a 30-meter-long section of this HTS cable will be upgraded with a 2G cable.

Wires and cables are by no means the only starting materials for HTS applications. Researchers around the globe routinely craft dozens of different types of electronic, magnetic, and optical sensors from highly pure thin films of HTS materials. These sensors are used in applications as diverse as quantum computing experiments and medical imaging devices that track minute magnetic differences in tissue. For now, however, many experts think the biggest potential for HTS lies in the electric utility business. “The major impacts of HTS will not be felt until power applications become a reality,” says Paul Chu, a superconductivity expert at the University of Houston in Texas and president of the Hong Kong University of Science and Technology.

Despite the initial success of the power cables that have been installed so far, the HTS industry still faces an uphill battle. In the minds of utility executives, HTS “is still a very high-risk technology,” says Alan Lauder, executive director of the Coalition for the Commercial Application of Superconductors in Kennett Square, Pennsylvania. Utility companies face a powerful disincentive to adopting new technology because they typically cannot pass the cost along to customers. They also worry about the reliability of unproven technology. “If you want something that will live for 30 years and never fail, you will not accept a 2-year demonstration project,” says Lauder.

Some HTS applications are unique, however, or boast advantages that far outstrip those of conventional technology. One example, Lauder says, is utility-grid devices called fault current limiters that prevent power surges from destabilizing local grids. Another set of utility devices, called HTS dynamic synchronous condensers, help the grid maintain a near-uniform level of electrical “pressure” or voltage. Both applications are expected to be widely adopted in the next few years, Lauder and others say.

Still, despite a few successes, the outlook for HTS products remains mixed. Prototype magnetically levitated trains are running in Germany, China, and Japan, but it's not clear whether their use will spread. Plans for superconducting computers washed out years ago. And even one successful application—electronic filters used in the base stations that route cell phone calls—could be overtaken by improvements in conventional filter technology. “Customers do not care about science,” says Arizona State's Rowell. “They care about a box with high performance and low cost.”

Cost remains the biggest challenge for HTS devices, particularly for HTS wire that's priced three to five times higher than its copper equivalent. With the production of 2G wire now being scaled up, companies say they hope to close that gap by the end of the decade. But if the last 20 years have offered any lesson for entrepreneurs, it's that HTS hype inevitably yields to a more sober assessment of what the new science can deliver to the marketplace.

15. HIGH-TEMPERATURE SUPERCONDUCTIVITY TURNS 20

# Determined Duo Scored a Victory for Small-Scale Science

1. Daniel Clery

Two-man teams rarely win physics Nobels, but the discoverers of high-temperature superconductivity showed the power of in-depth knowledge and a good hunch

In 1986, the 75th anniversary of the discovery of superconductivity, the field was literally stuck in a deep freeze. Researchers' main goal was to raise the critical temperature (Tc) below which a material conducts electricity with no resistance, but progress had been slow. Starting at 4.2 kelvin in 1911 with mercury, Tc was pushed up from the 1930s onward with a series of intermetallic compounds—crystals made from different metals—all involving niobium. By 1973, the best Tc had reached just 23.3 K (in Nb3Ge), and there it had stuck.

Researchers at IBM's Zurich Research Laboratory in Rüschlikon, Switzerland, like many others, decided it was time to take a new approach. In 1983, physicist Karl Alexander Müller, a Rüschlikon researcher for 2 decades, asked J. Georg Bednorz, a crystallographer specializing in materials known as perovskites who had joined IBM the previous year, to help him with an unlikely project: searching for new superconductors in complex metal oxides, materials usually known as insulators.

Müller had recently returned from a 2-year sabbatical at IBM's research center at Yorktown Heights, New York, and was fired with enthusiasm to study superconductors. Years of work with a perovskite oxide of strontium and titanium (SrTiO3) had convinced him that perovskites had potential as superconductors. SrTiO3 was already known to superconduct at the low temperature of 0.3 K, and in 1978, Bednorz, then at the Swiss Federal Institute of Technology (ETH) in Zurich, had collaborated with Rüschlikon's Gerd Binnig in coaxing the Tc of SrTiO3 up to 1.2 K by doping the crystal with niobium. Another perovskite, made from barium, lead, and bismuth, had been shown in 1975 to have a Tc of 13 K.

Perovskite crystals with the right combination of metal ions, Müller concluded, would be conductors with a strong coupling between electrons and phonons, ripples in the crystal lattice that in conventional superconductors act as the glue to stick electrons together in pairs—an essential part of the superconducting process.

Bednorz and Müller started out with perovskites of lanthanum, nickel, and oxygen and systematically replaced varying amounts of the nickel with aluminium, the lanthanum with yttrium, and finally the nickel with copper. But superconductivity remained elusive. In 1985, they paused to survey the literature. Late that year, Bednorz came across a paper by French researchers that described a perovskite of barium, lanthanum, and copper. The French team was interested in its catalytic properties at higher temperatures, but Bednorz realized that it fitted his and Müller's conceptual model perfectly.

Bednorz immediately set about fabricating samples of the Ba-La-Cu oxide. But other duties in the lab and a vacation during January kept him from testing the samples until late January 1986. As he described later in his Nobel lecture, Bednorz cooled down a sample connected to a probe to measure resistivity. At first, the sample appeared to behave like a conductor; then, at 11 K, the resistivity dropped away. Over the next 2 weeks, the pair repeated the experiment over and over, varying the composition of the oxide until they had a material in which the resistivity reliably dropped at 35 K. That was incredibly high by the prevailing standards of superconductivity. But they were well aware that the field was littered with similar claims that could never be reproduced. Nevertheless, given the importance of the discovery, Bednorz and Müller published a paper in the journal Zeitschrift für Physik describing “Possible High Tc Superconductivity” before they were 100% sure.

They were missing a crucial piece of evidence: the Meissner effect, the ability of a superconductor to expel all magnetic flux from its interior. The pair had an agonizing wait for the delivery of a DC SQUID magnetometer to perform the magnetic measurements. In September, the machine was installed in the Rüschlikon lab. Bednorz, with the help of Masaaki Takashige, a Japanese researcher visiting Rüschlikon for a year, measured the material's magnetic susceptibility and confirmed the Meissner effect at about the same temperature as the resistivity drop. Bednorz and Müller were now confident that they had found a new class of superconductor and published their new results in Europhysics Letters.

## Heating up

As the pair expected, Bednorz says, their early talks about the discovery drew a fair amount of skepticism. In late November, however, newspapers reported that Shoji Tanaka's group at the University of Tokyo had successfully repeated their experiments, and Paul Chu's team at the University of Houston in Texas quickly added further confirmation. Chu went on to boost the Tc to 50 K by putting the sample under hydrostatic pressure and then, by replacing the lanthanum with yttrium, achieved superconductivity at the unimaginably high temperature of 93 K.

Suddenly, superconductors were the hottest show in town. Dozens of groups were replicating the IBM work and trying different oxides. “It was an exciting time,” says Bednorz. Such was the flood of new work that at the March 1987 meeting of the American Physical Society in New York City, a special evening session was hastily organized to hear some of the new work. Organizers allowed researchers just 5 minutes each for their presentations, as 1800 people crammed into a conference room designed for 1100 and another 2000 watched outside on TV screens. This session, dubbed the “Woodstock of physics,” continued into the early hours of the next morning.

While Müller attended the New York meeting as a guest speaker, Bednorz was invited to a meeting of the German Physical Society at his alma mater, the University of Münster. The superconductivity session was so crowded that he had trouble getting in. When he politely asked the person blocking the doorway if he could get through, he was told: “Look, we all want to get in.” Bednorz says there followed a period of almost constant travel, mostly to the United States and Japan. He gave 52 talks in 9 months. “It was hard to get any work done,” he says.

After receiving the Nobel Prize in physics in December 1987—the shortest gap between discovery and award of any Nobel—things settled down a bit. Bednorz and Müller returned to characterizing the materials and trying other combinations of metals in their perovskites. “We had modest success, but others were quicker,” Bednorz says, and the big advances happened elsewhere.

In the early 1990s, Bednorz began experimenting with growing perovskites in thin films on various substrates. He and his colleagues were partly searching for potential applications in electronics, but the technique of growing superconducting oxides by epitaxy—depositing them layer by layer with evaporated atoms—also opened avenues for basic research. Just as Chu's group had boosted Tc with hydrostatic pressure, Bednorz and colleagues studied how Tc changed when they put the superconductor under compressive or tensile stress by using a substrate in which the period of the lattice was slightly smaller or larger than that of the superconducting oxide. Also, by growing a superconductor in a layered semiconductor structure, they could add or remove charge carriers from the oxide to see what effect that had on its conduction.

In the years following the discovery, Müller collaborated on superconducting applications with researchers at Los Alamos National Laboratory in New Mexico as well as at the firm American Superconductor in Westborough, Massachusetts. He has also remained active in unraveling the mechanism behind their discovery, sticking to his original thesis that a quasi-particle in the crystal lattice called a polaron—an electron and the deformed and polarized lattice around it—is key to the process.

Most superconductivity theorists now believe that such lattice vibrations don't play a role in high-temperature superconductivity. But Müller maintains that the superconducting layer in the perovskites is not homogeneous and that small areas within it harbor polarons, and he cites some experimental results to support his case. “Experimentalists are leading now. Theorists should listen for a bit,” Müller says. Paul Grant, an emeritus IBM researcher at the Almaden research center and longtime colleague of Müller, says the idea is worth watching. “He's definitely in the minority, but it's hard to see where he's really going wrong,” Grant says. “If he's right, he deserves another trip to Stockholm.”

## Beyond superconductivity

In the mid-1990s, Bednorz changed tack. “I went back to my origins, insulating perovskites, the field that brought myself and Alex Müller together,” he says. He is now working on insulating materials that can be momentarily converted into a conducting state with an injection of charge carriers. That's useful because short voltage pulses can flip such materials between two resistance states, each of which is entirely stable on its own and doesn't need any power to keep it that way. If that can be achieved on a minute scale, the material could form the basis of a nonvolatile memory, a chip that remembers data when the power is switched off—a major goal for the computer industry. Bednorz and his colleagues have demonstrated switching in cells as small as 100 by 200 nanometers. “If we can get to these dimensions, with single cells working reliably, it will be very competitive,” he says.

Müller, meanwhile, became influential in Swiss science policy circles, and he began to press the government to build a national synchrotron facility. To pursue that goal, a group formed around him that became known as the “Alex Müller Committee,” says Leonid Rivkin of the Swiss Light Source, which finally opened its doors in 2001. “I try to help science in such ways,” Müller says.

Nine years ago, Müller retired from IBM and moved completely to the University of Zurich, where he had held a part-time position for some years. At age 80, he is still active, writing papers and giving talks. He's got a hunch that sooner or later researchers will come across an entirely new class of high-temperature superconductors. “It's highly unlikely that there isn't another class,” he says, adding that he has some ideas but declining to reveal them.

Grant respects the two researchers highly. “Georg's persistence in pursuing Alex's ideas was key,” he says. And Müller's impact on research at IBM has been huge. “He identifies good problems, places to explore, and good people.” While Müller headed the physics section at Rüschlikon, he not only won a Nobel Prize but also hired three other researchers who became Nobelists. “All that occurred because of Alex Müller.”