News this Week

Science  02 Jan 2009:
Vol. 323, Issue 5910, pp. 22
  1. THE TRANSITION

    Holdren Named Science Adviser; Varmus, Lander to Co-Chair PCAST

    1. Eli Kintisch,
    2. Jeffrey Mervis

    In 1997, Harvard physicist John Holdren co-chaired a report by a White House advisory panel that recommended transforming the federal science enterprise to “maintain [U.S.] leadership in the science and technology of energy.” In the ensuing decade, Holdren's reputation as an international energy maven has steadily risen. But his pitch for more federally funded research, for everything from greater energy efficiency to solar and wind technologies, hasn't changed. In April 2008, for example, he told late-night TV host David Letterman that the government “should be quadrupling to ten-tupling the amount of money we're putting into energy research and development.”

    Now Holdren will have a chance to make the case for such increases from the inside. In a 20 December 2008 radio address, President-elect Barack Obama announced that he has picked the 64-year-old career academic to be his science adviser and director of the White House Office of Science and Technology Policy (OSTP). In that job, he will also co-chair the President's Council of Advisors on Science and Technology (PCAST), which issued the 1997 report. Obama also chose biomedical heavyweights Harold Varmus, president of the Memorial Sloan-Kettering Cancer Center in New York City, and Eric Lander, director of the Broad Institute in Cambridge, Massachusetts, to be the two outside co-chairs of PCAST, and Jane Lubchenco, a marine ecologist at Oregon State University in Corvallis, as administrator of the National Oceanic and Atmospheric Administration.

    The U.S. research community is ecstatic about those selections and the previous announcement that Nobelist Steven Chu would be nominated as energy secretary (Science, 19 December 2008, p. 1774). All are already respected scientific leaders. (Holdren and Lubchenco are former presidents of AAAS, which publishes Science, and all five are members of the National Academy of Sciences.)

    Together with the appointment of Carol Browner to the new position of White House energy and climate czar, the selections portend an aggressive effort to carry out Obama's campaign commitment to make energy “the number-one priority.” In particular, they augur well for a massive government effort in renewable energy technologies and in setting tough limits on carbon emissions. “If you were going to launch an Apollo-like project [in energy], you would make appointments like the ones he's made,” says former presidential science adviser Neal Lane. “They're superb.” The crowd has also cheered the speed with which the next president has acted: President George W. Bush waited until June to name his science adviser, John Marburger.

    Added energy.

    John Holdren brings expertise in energy, climate, nuclear proliferation, and other topics to his job as presidential science adviser.

    CREDIT: KEITH SRAKOCIC/AP PHOTO

    But that euphoria will be tested in the weeks and months to come. The new president still has several important science slots to fill. The list includes the heads of the National Institutes of Health and the Food and Drug Administration. The betting is heavy that there will also be new leaders at NASA, the Centers for Disease Control and Prevention, and the Department of Energy's (DOE's) Office of Science, among other posts.

    There's also the question of how Obama plans to deploy his stable of talent. Marburger says the plethora of energy expertise among top Obama officials will mean that OSTP will “have to be redefined in relation to these other centers of formulating policy.” The divvying up of tasks has already begun. Last week, for example, it was Browner, not Chu, who briefed Vice President-elect Joe Biden on the outlines of a stimulus package expected to include both traditional construction projects and investments in green technology. Scientific groups have been lobbying heavily for research to be included in that package, which congressional Democrats hope to have ready for the president shortly after he's sworn in on 20 January.

    Holdren runs the Woods Hole Research Center in Massachusetts, which studies how human activity affects the global environment. Trained as a plasma physicist, he's been an international energy guru for 3 decades at the University of California, Berkeley, and at Harvard. OSTP manages much more than just energy, of course. But colleagues at Woods Hole say he's up to tackling policy questions in biology, environmental science, and nuclear proliferation, his specialty.

    The other appointees also boast impressive résumés. After winning the 1997 Nobel Prize in physics for developing methods to cool and trap atoms with laser light, Chu put aside his academic career to develop the next generation of biofuels and alternative sources of energy. Since 2004, he has been director of DOE's Lawrence Berkeley National Laboratory. Lubchenco has devoted her professional career to studying the sustainable management of marine environments, including the impacts of climate change, and to communicating that knowledge to the public.

    Graham Allison, Holdren's boss at Harvard's Belfer Center for Science and International Affairs, predicts he will make a smooth transition from outsider to insider. “John is the very model of a policy-relevant scientist,” says Allison, a former top defense adviser in the Administrations of Presidents Ronald Reagan and Bill Clinton. In practice, however, Holdren's effectiveness will be shaped largely by factors that are out of his control (see p. 28). Although nobody is betting that DOE's research budget will grow 10-fold, for example, Holdren would be a hero if he helps Obama achieve his goal of a 10-year doubling of the budget for U.S. basic research, including a massive bump in the energy sciences.

    Scientists could get a glimpse of the new Administration's approach to science policy when the rest of the members of PCAST are announced. In his December 2008 radio address, Obama promised “to remake PCAST into a vigorous external advisory council that will shape my thinking.” Varmus, who chaired a similar but unofficial group of experts that advised the Obama campaign, told Science recently that he, Lander, and Holdren have begun talking about a “reconstituted PCAST.”

    Obama has already gotten some surprisingly candid advice about PCAST from Marburger and his co-chair, venture capitalist Floyd Kvamme. A recent self-assessment of the council's record included recommendations to make it smaller, with more working scientists, and to give it a more active role in interacting with the White House, other executive agencies, and Congress. Whether or not Obama grants PCAST a larger role, his decisions to date suggest that he is prepared at least to listen to many of the nation's top scientists.

    A brief version of this story ran on Science's policy blog, http://blogs.sciencemag.org/scienceinsider

  2. SCIENTIFIC COLLABORATION

    Tehran Incident Threatens U.S.-Iran Project

    1. Richard Stone

    A member of a U.S. scientific delegation headed by the president of the Institute of Medicine (IOM) was interrogated for 9 hours last month in his Tehran hotel, casting a pall over years of painstaking efforts to cultivate ties between the Iranian and U.S. scientific communities. In a statement last week (nationalacademies.org/morenews/20081226.html), the U.S. National Academies labeled the incident a “serious breach,” and its three presidents declared that they “cannot sponsor or encourage American scientists to visit Iran unless there are clear assurances that the personal safety of visiting scientists will be guaranteed.”

    The incident involved Glenn Schweitzer, director of Eurasian programs at the academies. He was staffing a delegation led by IOM President Harvey Fineberg that was exploring opportunities for collaboration in the medical sciences. On 4 December 2008, three men who claimed to be security officers confronted Schweitzer and spent 3 hours grilling him in his hotel room. They returned 2 days later for another 6 hours of questioning. The men threatened to prevent Schweitzer from leaving Iran and told him that exchange visitors are not welcome.

    “This really was a big surprise. It's a risk we did not expect at all,” says William Colglazier, executive officer of the academies' National Research Council. One Iranian scientist told Science that two Iranian scientific academies have sent “official apologies” to Schweitzer, who was allowed to leave the country as scheduled on 7 December. But Schweitzer says he has not received the apologies, and Colglazier says the U.S. academies are still awaiting a formal response from the Iranian government.

    In a separate incident last week, Carl Ernst, a professor of religious studies at the University of North Carolina, Chapel Hill, ran into trouble with immigration officials at Tehran airport. Ernst was sent back to Dubai and had to wait 12 hours before getting permission to enter Iran, where he received a prize from the Institute for Social and Cultural Studies in Tehran.

    It's unclear whether the incidents are unrelated or the opening salvos of a concerted effort to derail academic cooperation with the United States. “There are various interest groups who are unhappy about people-to-people relations such as S&T exchanges. As a result, there will always be attempts to jeopardize these exchanges,” says one Iranian scholar. Others say that the risk of incidents is especially high in the run-up to Iran's presidential elections in June 2009. “Tension is seen as beneficial by many conservatives in Iran,” says a second Iranian scholar. “Conservatives are mostly suspicious and some of them even dead set against the opening of Iran towards the West.” In a promising development, Science has learned that organizers of a spring school on quantum many-body theory, to be held in May at the University of Tehran, were told this week by Iranian officials that “formal documents for security assurances” for visiting scholars have been sent to relevant institutes.

    Grilled.

    Glenn Schweitzer was questioned for 9 hours in his hotel room.

    CREDIT: NAS

    Schweitzer has spearheaded the academies' 8-year effort to nurture scientific ties with Iran, including workshops on various topics and an exchange of science policy experts between the academies and Sharif University of Technology in Tehran. “Glenn has always operated in an absolutely open and transparent way,” says Norman Neureiter, director of the Center for Science, Technology and Security Policy at AAAS (Science's publisher). “It's most unfortunate he was treated this way. It doesn't bode well for the future of these relationships.”

    Schweitzer, who was traveling on a visa issued expressly for the meeting, says this was his first problem in 10 trips to Iran in the past decade. But it's soured him on future visits. “I hope this is more of a bump in the road rather than a derailment,” he says. “But I won't go back. I'll let others pick up the mantle.”

    A brief version of this story ran on Science's policy blog, http://blogs.sciencemag.org/scienceinsider

  3. RESEARCH ASSESSMENT

    U.K. University Research Ranked; Funding Impacts to Follow

    1. John Travis

    Researchers and university officials across the United Kingdom were eagerly poring over detailed statistics last month, figuring out how their departments fared in the latest government rankings of research quality. The occasion was the release on 18 December of the results of the latest Research Assessment Exercise (RAE), a massive evaluation of 159 higher education institutions that cost more than £10 million to conduct and occupied some 1000 scientists, who spent up to a year on peer-review panels.

    Although the RAE carefully avoided giving overall scores, U.K. newspapers and other publications immediately used the data to construct “league tables,” and schools began celebrating or lamenting their rise or fall since the last evaluation in 2001. “We're really chuffed. It's been fantastic,” says a spokesperson for Queen Mary, University of London, a relative unknown that unexpectedly shot up into the top 15 of the U.K.'s research universities in several ranking tables.

    More is at stake than reputation. The results will affect the distribution over the next 5 years of some £1.5 billion in annual funding from the government. Although U.K. universities and specialty research institutions obtain about half their budgets from competitive grants and programs, industry, and educational charities, the RAE determines the annual governmental block grant each institution can depend on for an extended period. But how the RAE data translate into cash won't be announced until next spring. “This is an exercise in how you allocate scarce resources, and that part is still undecided,” says Nick Dusic of the U.K.'s Campaign for Science and Engineering.

    This is the sixth RAE since 1986 and the last in its current form. The government intends to switch from using peer-review panels to a more quantitative approach. There have been few complaints about those doing the peer review—primarily U.K. scientists but also some from abroad—yet each new RAE tends to spark a controversy. In 2001, a furor ensued when the Higher Education Funding Council for England (HEFCE) decided to concentrate funding allocations on institutions receiving the top grades (Science, 21 December 2001, p. 2448). In contrast, Scotland decided to spread its money more equally, which was made easier by the smaller number of research outfits there.

    The 2001 RAE was one factor that led to the closure of many science departments, and universities have been preparing for the next RAE ever since. The University of Manchester and the University of Manchester Institute of Science and Technology reportedly even decided to merge to present a stronger case for research funds. There have also been grumbles that certain schools submitted only their best for evaluation, a stratagem that may raise their position in the league table but could backfire because funding allocations are usually based in part on the number of researchers whose work was submitted to the RAE. The RAE “affects institutional behavior and individual behavior, though it's hard to tell how much game-playing is going on,” Dusic says.

    A big change in the 2008 RAE is that overall grades weren't awarded. Instead, institutions received a “quality profile” for each discipline—the percentage of research submitted that peer reviewers rated as world-class (4*), internationally excellent (3*), recognized internationally (2*), recognized nationally (1*), or unclassified. That more detailed evaluation, say school officials, should help them evaluate their strengths and weaknesses better. It also led to a lot of boasting about the overall quality of U.K. science, with HEFCE officials noting that 87% of U.K. research is of international quality, i.e., the top three grades.
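
    To see how a quality profile feeds that headline figure, consider a hypothetical department; the percentages below are invented for illustration and do not come from any actual RAE submission. Under HEFCE's definition, the department's share of international-quality research is simply the sum of its top three grades:

    $$ 15\%\,(4^{*}) \;+\; 30\%\,(3^{*}) \;+\; 40\%\,(2^{*}) \;=\; 85\% $$

    with the remaining 10% rated 1* and 5% unclassified falling outside the tally.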

    The next big controversy over the RAE concerns its death. In a new evaluation scheme, dubbed the Research Excellence Framework (REF), the U.K.'s higher education funding bodies plan to assess research quality primarily through quantitative measures such as competitive grants obtained, Ph.D.s granted, and citations received for papers. Many questions remain about whether this approach can truly evaluate research quality across disciplines that include music, midwifery, economics, philosophy, and all the traditional sciences.

    In preparation for the first REF, likely in 2014, U.K. officials are now running a test with about 20 U.K. institutions. They've also commissioned several studies, one evaluating whether interdisciplinary work would be at a disadvantage when measured by such metrics rather than by peer-review panels. (The conclusion: essentially, no.)

    Soothing some nervous institutions, those in charge of the REF have recently agreed that it will contain an element of peer review, although the exact role remains undefined. Rather than evaluating massive submissions of research results, a panel of scientists from each discipline may help pick the appropriate metrics—patents, performances, or citations, for example. “The REF will be much more of an evolutionary change than a radical departure from the RAE,” predicts Dusic.

    For now, with the 2008 RAE funding formula unknown and money allocations still up in the air until March, no one knows which science departments might close or who may be looking for a job. “There's not a lot they can do now except wait and see,” says Dusic. The RAE “has real consequences on people. That's why people take it so seriously.”

  4. RESEARCH FUNDING

    For Many Scientists, the Madoff Scandal Suddenly Hits Home

    1. Jennifer Couzin

    Casualty.

    An MIT center financed by the Picower Foundation will survive, but some of its funding is now in jeopardy.

    CREDIT: 2005 RICK FRIEDMAN/CORBIS

    The alleged Ponzi scheme created by Wall Street investor Bernard L. Madoff has touched many scientists, including dozens who received funding from a respected charitable group that was decimated when the fraud collapsed. The Picower Foundation announced last month that it would “cease all grant making effective immediately” after 20 years of operation. On its 2007 tax return, the foundation reported assets of nearly $1 billion. Its endowment was managed by Madoff.

    The Picower Foundation was especially generous to scientists, all of whom are now reeling from the sudden loss of millions of dollars in research money. “Of course, hearing on the news about Madoff and the people who lost millions or billions of dollars was a tragic thing … in abstract,” says J. Timothy Greenamyre, a neurologist at the University of Pittsburgh in Pennsylvania. Then, he says, he received an e-mail from Barbara Picower informing him that his and his colleagues' funding was over, “and it was suddenly terribly tangible and just devastating.”

    Greenamyre was one of seven senior Parkinson's disease researchers whose labs made up a consortium, now 5 years old, formed and funded by the foundation established in 1989 by Barbara and Jeffry Picower. In 2007 alone the seven together received about $4 million. One recipient compared the money to a prestigious U.S. National Institutes of Health (NIH) Pioneer Award; another said its value exceeded that of an R01, the bread-and-butter NIH grant on which many researchers rely. As recently as October, the Parkinson's group conferred with Barbara Picower and discussed expanding the consortium to include additional scientists.

    Along with its Parkinson's project, the foundation began financing a similar consortium in diabetes and obesity research just over a year ago, giving about $2.3 million in 2007 to five prominent researchers, including Jeffrey Flier, the dean of Harvard Medical School in Boston. All had been told that the funding would last at least 3 years; Flier, for example, was supposed to receive $1.5 million during that time. Now the funds have dried up, and researchers are trying to determine what, if anything, they have left over from grant money the Picowers already provided.

    The Picowers also extended their generosity to the Massachusetts Institute of Technology (MIT) in Cambridge; in 2002 they gave $50 million to form MIT's Picower Institute for Learning and Memory. That money has been paid, but an additional $4 million pledged to the institute in May is in jeopardy. It was intended to launch the Picower Institute Innovation Fund (PIIF), to support institute faculty members, postdoctoral fellows, and students pursuing creative or high-risk neuroscience research.

    The Picower Foundation had planned to provide $2 million per year for 2 years to PIIF, with the option of further funding if the science proved compelling, said Mark Bear, a neuroscientist and director of the Picower Institute for Learning and Memory, in an e-mail message. “The second $2 million installment is due in early 2009,” he wrote. “I am extremely disappointed that this exciting and innovative program is at risk.” It's not clear how or whether MIT can fill the money gap and keep PIIF running.

    The Picowers were also donating $200,000 a year for graduate fellowships at MIT, money that will likely vanish along with the foundation. They made numerous smaller donations to various disease foundations, such as $250,000 in 2007 to a mantle cell lymphoma consortium at the Lymphoma Research Foundation in New York City.

    Consortia members say they were deeply distressed by the e-mail message they received on 19 December from Barbara Picower, in which she informed them of the calamity. They say it's not just the money they will miss but also the Picowers' rare commitment to high-risk research.

    “The Picower money was really transformative for us,” says D. James Surmeier, an electrophysiologist at Northwestern University in Evanston, Illinois, who was part of the Parkinson's group. The Picowers enabled his lab to chase a hypothesis that was “fairly speculative” about how Parkinson's disease damages neurons. Surmeier found that an influx of calcium into dopamine neurons makes the cells vulnerable in Parkinson's disease and more likely to die. A drug used to treat high blood pressure blocks this flow, and Surmeier thinks it may help Parkinson's patients, too. “They were interested in the big break,” says Surmeier of the foundation. “The loss of that money now will stop that research dead in its tracks.”

    All members of the Picower-funded Parkinson's and obesity consortia with whom Science spoke agreed that the groups would likely disintegrate. However, a clinical trial of a possible drug therapy for Parkinson's based on Surmeier's work, organized without funding from the Picowers, is set to begin shortly. Another worry for these researchers, many of whom said the Picower money amounted to 20% or more of their funding, is that they may have to lay off members of their labs. Several were reluctant to discuss this possibility, noting that lab members were off for the holiday and didn't yet know that the Madoff fraud had touched them directly.

    Barbara Picower, the foundation's president, did not respond to an e-mail message, and a foundation attorney declined to comment. But in a statement announcing that the foundation would close, Barbara Picower noted that it had distributed more than $268 million in grants since its inception in 1989 and expressed hope that its impact would endure.

  5. PLANETARY IMPACTS

    Did the Mammoth Slayer Leave a Diamond Calling Card?

    1. Richard A. Kerr

    In this issue of Science (p. 94), a group of nine researchers presents the latest evidence for a cosmic catastrophe just 12,900 years ago. Last year in the Proceedings of the National Academy of Sciences (PNAS), six of these authors—along with 20 others—made a wide-ranging case for a shower of exploding comet fragments over the North American ice sheet. Such a cataclysm could have wiped out mammoths and other large mammals, abruptly ended the Paleo-Indian Clovis culture, and triggered a millennium-long return to near-glacial cold. The idea met with skepticism in the community of impact researchers.

    Now, this group of nine is presenting the kind of evidence impact researchers have been asking for: transmission electron microscopy showing nanodiamonds from the geologic moment of the putative catastrophe. Unlike the last time, impact researchers are intrigued by the new evidence. But they aren't persuaded yet. “It may be a very great discovery, [but] I can't say this is wrong or right. We need a bit more evidence,” says high-pressure mineralogist Falko Langenhorst of the University of Bayreuth in Germany.

    Diamond has long been associated with impacts. Fragments up to a few tens of nanometers in size have been found in debris from the impact that killed off the dinosaurs. Impacting asteroids can carry diamonds that formed around distant stars. Impactors can shock graphite to diamond; diamond may also condense from carbon vapor in impact clouds. The PNAS paper claimed to detect such impact nanodiamonds using nuclear magnetic resonance, but specialists in the method roundly rejected that NMR evidence as erroneous (Science, 7 March 2008, p. 1311).

    The transmission electron microscopy (TEM) evidence is faring better, up to a point. In the Science paper, geoarchaeologist Douglas Kennett of the University of Oregon, Eugene, and his colleagues report the detection of nanodiamonds at six locations widely spread across North America. The nanodiamonds appear in a thin layer of sediment radiocarbon-dated to 12,900 years ago, often at the base of a distinctive “black mat” of organic-rich sediment. The black mat has been associated with the beginning of the millennium-long Younger Dryas cold interval, the end of the Clovis culture, and the last of the mammoths.

    The nanodiamonds described in the paper were identified using TEM and electron diffraction analysis. They were usually encased in millimeter-size, honeycombed spherules made largely of carbon that the group and others have found at the base of the Younger Dryas boundary layer but not above or below it. At the fall meeting of the American Geophysical Union in San Francisco last month, four of the Science authors reported finding abundant nanodiamonds at a total of 16 sites across North America and western Europe at the Younger Dryas boundary, but not just above or below it.

    Sky with diamonds.

    Rather than stardust, nanometer-size diamonds may be a product of the newborn sun.

    CREDIT: D. J. KENNETT ET AL., SCIENCE

    Opinions vary among nanodiamond experts. “The chances are good they have nanodiamonds,” says spectroscopist Peter Buseck of Arizona State University, Tempe. Physicist Tyrone Daulton of Washington University in St. Louis, Missouri, is less convinced. “Maybe they found diamonds, maybe they didn't,” he says. “It looks interesting, [but] there's not enough information in this paper to say whether they found diamonds.” Langenhorst agrees. The measured electron diffraction “d-spacings” used to infer a diamond crystal structure are not unique, he notes: “There are many materials that have that structure. It could be just a metal.”
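
    For context on those d-spacings, a textbook bit of crystallography (this gloss is ours, not an analysis from the paper) shows why a spacing alone can't clinch the case. For a cubic crystal with lattice constant a, the spacing of the {hkl} planes is

    $$ d_{hkl} = \frac{a}{\sqrt{h^{2}+k^{2}+l^{2}}} $$

    Diamond (a ≈ 3.57 Å) thus has a strongest-reflection spacing of about 2.06 Å for its {111} planes, but face-centered cubic nickel (a ≈ 3.52 Å), for example, gives about 2.03 Å, close enough to be confused in a noisy measurement, which is exactly Langenhorst's caution.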

    Even those persuaded that nanodiamonds have been found have reservations. “I'm convinced they're nanodiamonds,” says geochemist Iain Gilmour of the Open University in Milton Keynes, U.K., but “that doesn't necessarily make a connection to impacts. The fundamental thing is tying down where they come from.” Researchers analyzing impacted rock and ejecta find plenty of diamonds, many of which have been strewn far and wide to accumulate in geologically quiet places like lake bottoms, swamps, and the deep sea floor. But few researchers have searched for diamonds in such accumulation zones away from known impacts. That has left the natural history of nanodiamonds largely a blank.

    One exception is the work of microscopist Dominique Schryvers of the University of Antwerp in Belgium and his colleagues. They found carbon spherules in almost every one of 70 samples from apparently undisturbed forest topsoils, grasslands, and swamps across Europe. The millimeter-size spherules—in soil estimated to be 1 or 2 millennia old—bear a striking resemblance to those found at the 12,900-year-old Younger Dryas boundary. In an early 2008 paper in Diamond and Related Materials, they describe nanodiamonds found in spherules from near Burghausen, Germany, and Spa, Belgium, though similar diamonds were found in selected samples from other areas. “We have no idea where the material is coming from,” says Schryvers. “There's no real proof of an impact. They could come from anywhere.”

    So once again the advocates of a Younger Dryas impact are hearing that they have yet to make their case. “They're a long way from being able to use [the nanodiamonds] to justify an impact,” says impact mineralogist Bevan French of the Smithsonian National Museum of Natural History in Washington, D.C. “I don't think you've got a unique impact marker” in nanodiamonds. But then, it took the better part of the 1980s for proponents of the dinosaur-killing impact to win that argument.

  6. CORAL REEFS

    Calcification Rates Drop in Australian Reefs

    1. Elizabeth Pennisi

    Wall Street isn't alone in suffering a steep downturn. A large-scale study in Australia's Great Barrier Reef has revealed that the rate at which corals absorb calcium from seawater to calcify their hard skeletons has declined precipitously in the past 20 years, slowing coral growth. The report, on page 116, provides empirical data that fuel concerns that increased carbon dioxide in the air is putting these diverse marine ecosystems at risk (Science, 4 May 2007, p. 678). “This study has provided the first really rigorous snapshot of how calcification might be changing” as the ocean temperature and acidity rise, says marine biologist Ove Hoegh-Guldberg of the University of Queensland in Australia. “The results are extremely worrying.”

    Corals start out as soft-bodied larvae that settle on hard surfaces, take on algal partners, and begin pulling dissolved calcium compounds out of the water to lay down a hard skeleton. A reef arises from whole colonies of corals laying down these secreted skeletons. To keep ahead of erosion, they must calcify quickly enough to make up for what's being lost due to wave action and other forces. But some coral experts worry that the expected doubling of the carbon dioxide in the atmosphere over the next 50 years—and the subsequent ocean acidification—could make keeping up next to impossible. Several laboratory studies have suggested that as seawater pH declines, so do coral calcification rates, although one recent experiment has not shown this effect.

    Glenn De'ath, an ecological modeler at the Australian Institute of Marine Science (AIMS) in Townsville, and his colleagues looked at archived coral samples for signs of such a slowdown. Between 1983 and 1992 and between 2002 and 2005, AIMS researchers had collected coral cores from 69 reefs spanning the length and breadth of the Great Barrier Reef. The coral sampled, Porites, grows over many decades, even centuries, into 6-meter-tall mounds. It lays down annual growth bands, making it possible to count back to specific years and correlate growth with sea surface temperature and other environmental data for the same period.

    Growth chart.

    Special lighting brings out the annual growth bands in this slice of coral, useful for studying the interplay between environment and calcification.

    CREDIT: GLENN DE'ATH

    The researchers sliced up cores and used x-rays and a technique called gamma densitometry to measure annual growth and skeletal density; from those two parameters they calculated annual calcification. In their first pass, De'ath, AIMS coral biologist Katharina Fabricius, and AIMS climatologist Janice Lough found a decline in calcification rates since 1990 in 189 colonies from 13 reefs. They then broadened the analysis to include a total of 328 colonies spanning coastal to oceanic locations, including 10 cores that dated back centuries.

    They found that calcification rates increased 5.4% between 1900 and 1970 but dropped 14.2% between 1990 and 2005, mainly because of a slowdown in growth from 1.43 centimeters per year to 1.24 centimeters per year. “The very fact that the effect is seen on inshore as well as offshore reef sites rules out [the chance that] the observed decline has been due to declining coastal water quality,” says Hoegh-Guldberg.
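
    Those numbers hang together with the method described above: annual calcification is derived from skeletal density and linear extension, and treating it as roughly their product is the standard convention in such studies (an assumption here, not a quotation from the paper). Holding density fixed, the slowdown in extension alone accounts for

    $$ \frac{1.43 - 1.24}{1.43} \approx 13.3\% $$

    of the decline, close to the reported 14.2%, with the small remainder attributable to changes in skeletal density.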

    In the 1990s, researchers had predicted that ocean acidification might one day become a problem for corals, and at the 11th International Coral Reef Symposium (ICRS) in July 2008, global experts suggested that humans had about a decade to “get our act together” to reduce carbon dioxide emissions, says Clive Wilkinson, global coordinator of the Global Coral Reef Monitoring Network in Townsville. But calcification decline is “here and now and over the past decade, not some time in the future, as we predicted,” he notes. “This has been happening under our noses.”

    Problems are showing up beyond the Pacific. Anne Cohen, a marine biogeochemist at the Woods Hole Oceanographic Institution in Massachusetts, and her colleagues are using computed tomography scans to look at growth and calcification in Bermuda brain corals. There's been a 25% decline in both since 1959, they reported at ICRS. “On Bermuda, the change appears unprecedented at least through the past century,” says Cohen.

    However, the role of acidification in this decline is far from settled. In the laboratory, Alina Szmant, a coral physiological ecologist at the University of North Carolina, Wilmington, and her colleagues found that neither low pH nor a lowered calcium carbonate concentration (which results from increased acidity and is considered key to calcification) slowed coral growth. Instead, bicarbonate proved key, her team reported at ICRS. She faults previous lab studies for using hydrochloric acid, not carbon dioxide, to lower the pH of the water. Hydrochloric acid and carbon dioxide have different effects on seawater chemistry and bicarbonate concentration, she says. Her conclusion: “It's not clear that carbon dioxide enrichment will have negative effects on calcification rates.”

    Cohen also has some reservations. Commenting on De'ath's work, she says “the timing is later, and the magnitude and rate of the calcification decline is greater than one might expect if seawater saturation state [acidification] were the primary driver.” Temperature and other environmental factors likely come into play.

    But whatever the cause, the new data are grounds for concern. In December, Wilkinson's group released a 4-year assessment, Status of Coral Reefs of the World: 2008, which said that 45% of the world's reefs were healthy with no significant threats from human activity on the horizon. But looks can be deceptive, says Cohen: “Even though the corals look healthy, some pretty dramatic changes are occurring beneath the surface.”

  7. SCIENCE ADVISERS

    Bending the President's Ear

    1. Eli Kintisch

    A science adviser is only as effective as the president wants him to be, say the men who have held the job since Sputnik.

    Truth to power.

    Donald Hornig with Lyndon Johnson, and George Kistiakowsky with Dwight Eisenhower.

    CREDITS: BETTMANN/CORBIS

    On 7 August 1996, Jack Gibbons was facing a crisis. Gibbons, President Bill Clinton's science adviser, had been out of the country when news leaked that federal and academic scientists had discovered what looked like fossilized bacteria-like organisms in a meteorite that had fallen from Mars. NASA officials were recommending caution. But Dick Morris, an influential White House political adviser, was urging Clinton to announce that the United States would send astronauts to the Red Planet to search for signs of life.

    Gibbons quickly sent word to the president to let “science run its course.” Clinton agreed, and in a hastily arranged press briefing the chief executive emphasized robotic missions and the “fundamental” questions that remained about the meteorite's significance. Today, most Mars scientists believe it contains no signs of biological activity. Gibbons's intervention, says his press aide, Rick Borchelt, was key to “talk[ing] us off the ledge that Dick Morris wanted us on.”

    In discussing how decisions are made in the White House, Gibbons likes to quote Victor Hugo's adage that “science has the first word on everything, but the last word on nothing.” Even so, the Mars rock episode showed that the science adviser can influence national policy. To stay abreast of fast-moving events, Gibbons relied on a strong network within the West Wing. His team of scientific experts provided rapid analysis of NASA's views. Gibbons also had enough credibility among Clinton's senior advisers to get his message delivered in time. Those qualities, say former officeholders, are exactly what an adviser needs to be effective.

    Last week, President-elect Barack Obama selected John Holdren to be his science adviser and head of the Office of Science and Technology Policy (OSTP) within the White House (see p. 22). In doing so, Obama also pleased science mavens by saying that he would elevate Holdren to the position of assistant to the president, a title held by Gibbons but not by the current adviser, John Marburger. “Under Bush, the position has been marginalized,” says Rodney Nichols, former president of the New York Academy of Sciences and author of one of three recent reports urging the next president to take such a step.

    At the same time, however, insiders say that title is only one factor in determining a science adviser's effectiveness. It's just as important to be a savvy politician, an effective manager and convener, and a figure respected by both senior White House officials and the scientific community. “It requires grace and intelligence and sharp elbows,” Nichols says.

    Those qualities must be exercised within an environment set by the president's agenda and influenced by day-to-day events. D. Allan Bromley, adviser to George H. W. Bush, was given the opportunity to create the cross-agency climate science program. Marburger was asked to defend the policy adopted by President George W. Bush early in his term to limit federal research on embryonic stem cell lines. John F. Kennedy told his science adviser, Jerome Wiesner, exactly what would happen if funding fertility research at the National Institutes of Health, which Wiesner had urged, came back to haunt him: “I'll blame it on you,” Wiesner quoted the president as saying in a 1980 memoir.

    Former officeholders also stress the importance of being a member of the president's team. Bringing one's scientific expertise to bear on an issue is not the same thing as arguing on behalf of the scientific community, they say. As George Keyworth, science adviser to President Ronald Reagan, put it in a 2006 speech, a new adviser must avoid at all costs the perception that he comes “with an agenda that might differ from the president's.”

    Ties that break

    Unlike the national security adviser or a Cabinet secretary, the science adviser holds little inherent power. But circumstances can give the officeholder an inside track to power. At the request of President Franklin Roosevelt, Vannevar Bush, director of the Carnegie Institution of Washington, famously proposed expanding federal support for science at the end of World War II, setting in motion a process that led to the creation of the National Science Foundation in 1950. After Sputnik, President Dwight Eisenhower appointed electrical engineer James Killian as the first full-time presidential science adviser.

    Early advisers served crucial roles in the rapidly evolving U.S. military and space enterprise. But many observers say the position has declined in influence over time. Roger Pielke Jr. of the University of Colorado, Boulder, wrote last year in Nature that modern presidents are “hardly in need of more science advice,” now that the government receives guidance each year from an estimated 18,000 outside experts.

    So what makes a science adviser effective? Former White House staff members say personal ties, like Gibbons's link to Vice President Al Gore, can help. Likewise, Bromley, a physicist at Yale University, Bush's alma mater, served on various White House advisory committees while Bush was vice president under Ronald Reagan. When Bush later asked him to become his science adviser, Bromley extracted a promise of support and Oval Office access when necessary. “Because of Bromley's proximity to the president, Cabinet-level people came to his meetings,” says Damar Erb, his executive assistant.

    In contrast, Lee DuBridge had been a respected adviser to the Pentagon before being tapped by President Richard Nixon. But Nixon's inner circle saw him as a water carrier for the scientific establishment and excluded him from budget meetings. That wasn't his only liability, however. In 1970, IBM physicist Richard Garwin, a member of the advisory group that preceded the President's Council of Advisors on Science and Technology, spoke out against the Administration's backing of research into supersonic flight. DuBridge's public support of the program was deemed by White House aides to be insufficiently forceful, says former Science reporter Dan Greenberg, and he was forced to resign months later.

    Edward David, his replacement, also had a short and rocky tenure. “Ed David … [says] just to give another half-billion to the National Science Foundation. That isn't going to do a goddamn bit of good,” Nixon told his advisers in a conversation recorded in June 1971 and described by Greenberg in his 2001 book, Science, Money, and Politics. After Nixon's landslide reelection the following year, he eliminated the post of science adviser, its staff, and its advisory committee. (In 1976, Congress passed legislation, signed by President Gerald Ford, that reinstated the office.)

    But staying in the president's good graces isn't always enough, either. To be effective, a science adviser also needs chutzpah, says Clinton adviser Neal Lane, who recalls a meeting to discuss upcoming presidential initiatives to which he wasn't invited. Lane showed up anyway and persuaded an aide to put a name card for him on the table. “It was a very important meeting,” says Lane, who recalls Clinton asking him about a proposed nanotechnology initiative that ended up in the president's next budget request to Congress.

    Former Marburger staffer Pat Looney says Marburger earned a reputation in the White House as an “honest broker” on issues including coordinating astrophysics experiments and competitiveness legislation. He did it by not “telling [agencies] how to run their business” and working well with White House budget officials.

    But that influence only extended so far. Last year, when Julie Gerberding, head of the Centers for Disease Control and Prevention, submitted congressional testimony on climate change to the White House for review, it was watered down. Marburger's staff said their views had been ignored, and the culprit was later revealed to be Vice President Dick Cheney. “Having the OSTP comments just be one among many seems to not be giving scientific input the priority it deserves,” says Michael MacCracken, a former White House climate change official.

    Former advisers differ on how broadly they should try to spread their tentacles. Bromley, regarded as one of the most effective science advisers in recent times, described in a 1994 memoir how he focused on a “small number of areas … that were of particular national importance.” Bromley restructured the Federal Coordinating Council for Science, Engineering and Technology and won the president's support for a rule stating that agencies had to send their top officials “or they were not represented at all.” The council helped Bromley coordinate federal climate research and materials science as well as launch initiatives on international science and science education.

    Gibbons greatly expanded the council, however, arguing that “you have to coordinate somehow.” But Professor Christopher Hill of George Mason University in Arlington, Virginia, believes Gibbons went too far. “The agencies were just overwhelmed with committee work,” Hill says.

    There's no question that a successful science adviser needs a strong staff. But experts disagree on just what that means. The report that Nichols co-authored, from the Woodrow Wilson International Center for Scholars, recommended the traditional complement of four Senate-confirmed deputies. That's double the number Marburger believed sufficient when he set up his office in 2001. Eliminating an environmental deputy weakened the White House's ability to coordinate climate change research, says Richard Moss, the White House climate program director under Clinton and, until 2006, for Bush (Science, 10 October 2008, p. 182). But Marburger blames the timidity of the program itself, saying that OSTP and the White House budget office “were willing to support a much bolder reallocation of resources than [the program] ever recommended.”

    Obama's pledges and appointments to date raise the hopes of science policy watchers that their concerns will be addressed in his administration. But if history is any guide, Holdren's effectiveness will also be determined by his ability to navigate the political winds swirling about the White House.

  8. BRAIN CANCER

    A Viral Link to Glioblastoma?

    1. Greg Miller

    Circumstantial evidence hints that cytomegalovirus, a common herpesvirus, may play a role in the aggressive brain cancer, but big questions remain.

    Delicate operation.

    Charles Cobbs removes a brain tumor at California Pacific Medical Center.

    CREDIT: LACY ATKINS/SAN FRANCISCO CHRONICLE

    SAN FRANCISCO, CALIFORNIA: Neurosurgeon Charles Cobbs is trying to remove as much of a brain tumor as possible without damaging tissue important for speech. He needs the patient's cooperation, so the anesthesiologist dials down the propofol anesthetic and places a laptop computer on a stand near the patient's head. The 77-year-old man, who has a glioblastoma near the crease that separates the frontal and temporal lobes, slowly wakes up.

    Images of familiar objects flash on the screen, and the patient names them while Cobbs electrically stimulates his exposed cortex with a hand-held probe. “Cow, car, hairbrush,” the man says. He sounds groggy, but the words are unslurred. Cobbs turns up the current for stronger stimulation and does another pass as the man counts to 50. Satisfied that the tumor can be safely resected, the surgical team here at the California Pacific Medical Center puts the patient back under, and Cobbs scoops out a bloody glob of tissue, most of which goes into a vial to be sent to the lab.

    After the surgery, Cobbs is optimistic. Glioblastoma is the worst grade of malignant glioma, a relatively uncommon but deadly class of brain tumors that kills 97% of patients within 5 years of diagnosis. But Cobbs says the material he removed had the rubbery feel of dead tissue rather than the “grape jelly” consistency of a growing tumor. That would be good news for the patient, and Cobbs thinks it may also bode well for his own unorthodox hypothesis about malignant glioma.

    Cobbs thinks the cancer may be caused, or at least spurred on, by cytomegalovirus (CMV), a common herpesvirus. Several years ago, Cobbs reported that malignant gliomas are rife with CMV, and he says work by other researchers supports his idea that the virus may be a potent promoter of these cancers. His patient has been taking an antiviral drug called Valcyte, made by Roche Pharmaceuticals, in addition to receiving standard radiation and chemotherapy. Cobbs thinks drugs and other therapies that target CMV may extend the lives of patients who otherwise have few options.

    The idea is controversial and unproven. Why would such a common virus cause cancer in only a small subset of those infected? And why have test tube experiments so far failed to show that CMV transforms healthy cells into cancerous cells? These and other nettlesome questions must be answered before Cobbs's theory can gain widespread acceptance.

    Yet several researchers and clinicians who attended a meeting Cobbs organized in Boston last October say there is enough circumstantial evidence in support of the idea to merit serious attention. “It's a promising concept, and it gives us hope of new treatment strategies for this horrible disease,” says Cecilia Söderberg-Nauclér of the Karolinska Institute in Stockholm, Sweden. She and others have already begun small clinical trials with drugs and immunotherapy directed at CMV.

    Counter to the mainstream

    At age 45, Cobbs still has the rangy build of a collegiate rower, which he was at Princeton University. Soft-spoken and bespectacled, he doesn't come across as a rebel, although his thinking about malignant glioma has run counter to the mainstream. Cobbs did research as a medical resident suggesting that these tumors were “boiling cauldrons of inflammatory molecules,” and he wondered whether this inflammation might somehow trigger the cancer or help it grow. It wasn't a fashionable idea at the time, and for a young investigator, it was a risky avenue to pursue.

    Reading Surely You're Joking, Mr. Feynman!, a memoir written by the iconoclastic physicist Richard Feynman, boosted his confidence. Cobbs says the message he took away from the book is that “your job as a scientist is to think for yourself and question what the herd is thinking.” Cobbs had a hunch that CMV might be just the sort of virus to lurk around and cause the kind of chronic inflammation he suspected of contributing to cancer. CMV infection lasts for life, and mouse studies have found that the virus infects the neural stem cells thought to give rise to glioblastomas. To investigate, he and colleagues tested 27 malignant glioma samples for CMV proteins and genetic material. All were positive, the researchers reported in Cancer Research in 2002. Samples of noncancerous brain tissue tested negative.

    Not everyone was convinced by the findings. In 2005, a team of researchers at City of Hope Hospital in Duarte, California, reported in Modern Pathology that they'd failed to find CMV in glioma samples. “I was kind of distressed and heartbroken,” Cobbs recalls. A team at Duke University Medical Center in Durham, North Carolina, wasn't finding CMV in its samples, either. “We called [Cobbs] and said, ‘What's the deal?'” says Duane Mitchell, a neuro-oncologist at Duke. Cobbs suspected that the other researchers were using less sensitive methods and offered to visit both teams to demonstrate his technique. Only the Duke team took him up on it.

    Since then, there have been several confirmations. Last February, Mitchell and colleagues reported finding CMV in more than 90% of glioma samples they examined; a team at M. D. Anderson Cancer Center in Houston, Texas, published similar findings a few months later; and Söderberg-Nauclér and colleagues have a forthcoming paper as well. “Without a question, the virus is there,” says Söderberg-Nauclér. But is it a bystander or an agent provocateur?

    A suspicious résumé

    CMV is one of eight human herpesviruses; it infects at least half of the population in developed countries and nearly everyone in developing countries, where poor sanitation and hygiene encourage its transmission. Although it generally doesn't cause problems in healthy adults, CMV is a common cause of birth defects, and it can cause a host of serious problems in immunocompromised people.

    How it might trigger or abet cancer is not known, but there's no shortage of possibilities, says virologist Jay Nelson of Oregon Health and Science University in Portland. One way it could do this “is by preventing the immune system from taking the tumor out,” he posits, because CMV is adept at manipulating signaling molecules on cell surfaces that would otherwise flag infected cells for destruction. Another possibility is that the virus helps sustain tumors by promoting angiogenesis, the growth of new blood vessels. Nelson's lab reported last year that CMV-infected heart cells churn out a slew of growth factors and other compounds that spur angiogenesis.

    The virus also makes proteins that interfere with cellular machinery that normally prevents tumors, says Robert Kalejta, a cancer biologist and virologist at the University of Wisconsin, Madison. In 2003, Kalejta and colleagues reported that a CMV protein called pp71 helps break down retinoblastoma protein (Rb), a potent tumor suppressor. More recently, they identified a CMV enzyme called UL97 that inactivates Rb (Science, 9 May 2008, p. 797). “Clearly, these are two proteins that can do what's required” to transform healthy cells into tumor cells, Kalejta says. “The question is, are they really doing that in these cancers?”

    Kalejta says he's intrigued but not yet convinced. He notes that unlike proven cancer-causing viruses such as human papillomavirus and Epstein-Barr virus, CMV has not been shown to transform healthy cells into cancerous cells. The lack of direct evidence that CMV can cause cancer remains an important caveat, Kalejta says.

    Liliana Soroceanu, a neuroscientist who collaborates with Cobbs at the California Pacific Medical Center, hopes to gain clues about how CMV might contribute to malignant glioma by culturing cells from tumor tissue Cobbs has removed from patients. She plans to use a CMV gene chip developed by Nelson's team to examine which genes the virus turns on in infected glioma cells—and to see whether turning those genes off, using RNA interference, for example, slows the growth of cultured tumors.

    Partners in crime?

    Cobbs suspects that malignant gliomas like this one may be aided by cytomegalovirus.

    CREDIT: LACY ATKINS/SAN FRANCISCO CHRONICLE

    Bedside stories

    Evidence of a link between CMV and glioma has come from the bedside as well as the lab bench. One serendipitous finding comes from a clinical trial with glioblastoma patients run by Robert Prins and colleagues at the University of California, Los Angeles. They have been testing a vaccine intended to train patients' immune systems to attack their tumors. To do this, they grow immune cells from each patient in culture, expose them to proteins from that patient's tumor, and inject them back into the patient.

    As it happens, one patient responded to the vaccine by mounting a massive immune response against a CMV protein, the researchers reported last July in The New England Journal of Medicine. That patient has now survived nearly 6 years without a recurrence of the cancer, longer than 99% of patients who receive standard care, Prins says. (All patients in the trial received standard care—surgery, radiation, and chemotherapy—in addition to the vaccine.) Although it's impossible to know if the vaccine is the reason, given the slim odds of surviving that long with glioblastoma, Prins thinks it's highly likely.

    At Duke, Mitchell and colleagues are pursuing a similar approach more tightly focused on CMV. They are vaccinating glioblastoma patients with immune cells exposed to CMV proteins. The trial is intended to assess the safety of the treatment, but there are signs that it is having the desired effect. At November's meeting of the Society for Neuro-Oncology, the researchers reported that the vaccine had enhanced immune responses against CMV, and the 21 patients in the trial have so far had a median survival time of more than 20 months. The median survival time for such patients is typically less than 15 months, Mitchell says.

    At the Karolinska Institute, Söderberg-Nauclér and colleagues have just completed a 2-year trial with Valcyte, the antiviral drug Cobbs's patient received. (Cobbs prescribed the drug off-label, not as part of a clinical trial.) The 42 patients with glioblastoma began taking Valcyte after surgeons had removed their tumors and researchers confirmed that the tumors were infected with CMV. Researchers used magnetic resonance imaging to look for signs that tumors grew back. The study is double-blinded, so the researchers don't yet know which patients received the drug and which received a placebo. Even so, Söderberg-Nauclér is optimistic. “We have patients that looked extremely bad at the beginning that have survived 2 years,” she says. The trial concluded just before Christmas, and the team should know soon whether those patients received the drug—or just a remarkable stroke of good luck.

    Cobbs says he's hopeful that if the Swedish trial goes well, Roche will fund a larger one. He insists he doesn't want to hype the evidence for a link between CMV and glioma but says he's gratified that other researchers are starting to take the idea seriously. “It may be a house of cards, but my gut tells me it might pan out to be something big time.”

  9. POLAR SCIENCE

    A Death in Antarctica

    1. Jeffrey Mervis

    The death in 2000 of a young Australian astrophysicist at the U.S. South Pole station raised many troubling questions. Eight years later, there are few answers.

    Into the darkness.

    The setting sun in late March heralds 6 months of winter for those at South Pole's Amundsen-Scott Station.

    CREDIT: BRIEN BARNETT/NSF

    On the last day of his life, Rodney Marks woke up vomiting blood. The 32-year-old postdoc was wintering over at the U.S. Amundsen-Scott South Pole Station, where he was operating AST/RO (Antarctic Sub-millimeter Telescope and Remote Observatory) for the Harvard-Smithsonian Center for Astrophysics. Over the next 10 hours, Marks made three trips to see Robert Thompson, the station's doctor, becoming increasingly anxious, disoriented, short of breath, and pained. Then he went into cardiac arrest. After attempting to resuscitate Marks, Thompson pronounced him dead at 6:45 p.m. on Friday, 12 May 2000.

    Outside, the temperature was −62°C. The station would remain in the throes of the brutal Antarctic winter and its 24-hour nights for another 5½ months. Addressing the 48 scientists, construction workers, and service personnel at a hastily called meeting, Thompson explained that Marks had died of unknown but natural causes. With no way out until November and with plenty to do, each of the winterovers mourned the sudden loss of someone who had enriched the tight-knit community with his keen intellect, bohemian ways, and outgoing personality. Then they went back to work, leaving their fallen comrade to be preserved in storage by the inhuman cold.

    On 30 October, after flights resumed between Antarctica and New Zealand, Marks's body was taken out of storage and flown to Christchurch, New Zealand, on its way to burial in his native Australia. In mid-December, Martin Sage, a forensic pathologist in Christchurch, delivered another shocker: Marks, in apparent good health, had died of methanol poisoning. In dispassionate prose, Sage described how Marks had consumed approximately 150 milliliters of a colorless and slightly sweet-tasting liquid, commonly known as wood alcohol, under unknown circumstances. By the time Marks visited the base's rudimentary medical center, his system had converted the methanol—used routinely at the pole to clean scientific equipment—into formic acid, leading to the acute acidosis that caused his symptoms. The source of the methanol, Sage reported, “is not apparent from the accounts given to date,” adding that “there is a distinct possibility” Marks may not have known that he was drinking methanol.

    The new information in the autopsy was a revelation to colleagues, who had assumed his death was caused by a massive stroke or heart attack. It spawned a fresh set of troubling questions. Had Marks drunk the methanol intentionally? If so, why would he have wanted to kill himself? If the ingestion was an accident, how had it happened? And if it was foul play, had someone spiked his drink or switched glasses without his knowledge?

    “I can't imagine how he could have drunk it,” says Antony Stark, an astronomer at the Smithsonian Astrophysical Observatory and principal investigator for AST/RO, which is funded by the U.S. National Science Foundation (NSF). “I cannot believe he committed suicide. He had friends. He had a fiancée; the work was going well; the instrument was doing fine.”

    It would be nearly 8 years before the New Zealand government, in the person of coroner Richard McElrea, would deliver an official statement about what had happened. However, the coroner's report, published in September 2008, answered none of those questions—and raised several troubling new ones. “I formally record that Rodney David Marks … died as a result of acute methanol poisoning, the methanol overdose being undiagnosed and probably occurring 1 to 2 days earlier, …” McElrea begins the last paragraph of his 50-page report, echoing Sage's autopsy. Then the coroner jammed all the outstanding issues, still unresolved, into the last half of that grammatically challenged sentence. “[Marks] being either unaware of the overdose or not understanding the possible complications of it, the medical assistance to him being compromised by an Echtachem [sic] blood analyzer being inoperable, death being unintended.”

    The report was packed with fresh details from dogged police work and hearings held in 2000, 2002, and 2006. But it leaves unanswered why a healthy and apparently happy young scientist consumed a lethal amount of a known poison. One reason for the continuing mystery is the nature of the care that Marks received. A report by a team of physicians who reviewed Thompson's medical notes weeks after Marks died concluded that “additional laboratory investigation” and other analyses “were warranted” in treating Marks that day. The review, led by chief medical adviser Gerald Katz of Raytheon Polar Services (RPS) Co. of Centennial, Colorado, NSF's polar logistics contractor, also stated that several tests not performed “were available at the time and would have been helpful in narrowing the diagnostic possibilities.”

    Another contributing factor was the isolated locale. McElrea's report acknowledges the “limitations” posed by conducting an investigation from a distance of 5000 km and with little firsthand evidence available. Still, he says, there were steps that could have been taken. “The scene could have been preserved for photographs … and initial statements could have been obtained from all relevant personnel,” he writes. “Very little of this process happened,” he notes. He also cites “legal, diplomatic, and jurisdictional hurdles” erected by the U.S. government that delayed his inquiry. In most cases, the relevant parties were NSF, which is responsible for all U.S. scientific activity on the frozen continent, and RPS.

    Paul Marks, Rodney's father, thinks that the U.S. government assigned the case a low priority because of his son's Australian citizenship. “If it had been one of yours, a U.S. citizen,” Marks told Science recently, “I can't believe that the FBI wouldn't have been involved from the start and that no stonewalling would have occurred.”

    A final obstacle was the 7-month gap between the astronomer's death and the autopsy. Although the report from the team of physicians, submitted in July 2000, noted that “there is no evidence to point to homicide, accidental poisoning, environmental toxicity, or infection,” McElrea says its conclusion was premature because the autopsy finding methanol as the cause wasn't released until December. “I respectively [sic] disagree that accidental poisoning and even foul play can be adequately disregarded without a full and proper investigation … with proper protocols for preservation and recording of evidence.”

    A scientific life.

    Australian Rodney Marks worked on the SPIREX infrared telescope in 1997–98 and returned 2 years later to operate AST/RO.

    CREDITS: D. A. HARPER; (INSET) CENTER FOR ASTROPHYSICAL RESEARCH IN ANTARCTICA

    Popular in purple

    Rodney Marks grew up in a small coastal town in the southern state of Victoria. By age 7, he was doing crossword puzzles with the help of a thesaurus. A scholarship to a prestigious private school in nearby Geelong fed his budding interest in math and science, which he pursued at the University of Melbourne. In 1993, he enrolled in a Ph.D. program in astronomy at the University of New South Wales.

    By all accounts, Marks enjoyed shattering stereotypes. Astrophysicist Gene Davidson, a New Zealand native who wintered over to operate another telescope the same year Marks died, recalls meeting the bearded, rangy, 6'2" free spirit in the mid-1990s at a session for graduate students during the annual meeting of the Astronomical Society of Australia. “He didn't look like a typical scientist. He had long hair and [dressed] Goth, with black fingernails. He stood out,” says Davidson, now a scientist at Australia's research nuclear reactor in Sydney. Marks, who had dyed his hair purple during the winterover, also played guitar in a heavy metal band, The Changelings, that performed from the South Pole during a global celebration on 1 January 2000 marking the new millennium.

    Marks first wintered over in 1997–98, caring for an infrared telescope called SPIREX and using some of the data in his thesis. But he also liked sharing his passion for science. On Wednesday evenings, for example, Marks gave a series of 1-hour introductory lectures on astronomy to the entire base. In addition to being educational, the talks helped bridge the gap between scientist and layperson. “Rodney was a very popular person in the community,” recalls Darryl Schneider, a physicist who was wintering over that year to maintain the Antarctic Muon and Neutrino Detector Array.

    But science wasn't the only thing in Marks's life. On many nights, he'd hang out in the galley, socializing first with the nonsmokers, who left early, and then with the smokers, who tended to arrive later and linger until the wee hours. In his spare time, he provided free French lessons. (Michael Ashley, his thesis adviser, had recommended a short-term project at the University of Nice that required a knowledge of French. Marks, who spoke not a word of the language, told him, “Okay, sounds interesting; I'll do it.” Within a few months, Ashley recalled at a memorial service, Marks was fluent.)

    Marks's winterover in 2000, his third tour at the pole, required him to coordinate experiments being done remotely on AST/RO and to collect data on viewing conditions. “He was running the whole instrument, and he was doing a very good job,” says Stark, who has made 21 trips to Antarctica but never wintered over at the pole. “All our winterovers have gone on to good academic jobs, and I'm sure that Rodney would have gotten one, too. He was an excellent scientist.”

    Stark and others can't imagine that Marks would have jeopardized such a bright future by knowingly ingesting methanol, and Stark says the toxic liquid wouldn't have been just lying around. It was typically used in January when the cryogenic parts of the telescope were being cleaned and reassembled, he notes, not after the austral winter had set in and the instrument was in use. In any event, the bottles containing methanol were clearly marked and kept in a locked cabinet.

    The coroner's report offers no eyewitness accounts of how Marks swallowed the methanol. But it contains speculation from those who knew him about whether his drinking habits were to blame. Will Silva, a Seattle, Washington-based physician who has wintered over at the South Pole three times and had gotten to know Marks the previous season, testified that Marks “was a steady sort of bloke who drank to excess on occasion” but who had a “high tolerance for alcohol.” (Silva was working at Palmer Station on the Antarctic peninsula the year Marks died and was one of the doctors who reviewed Thompson's notes.) Davidson says that his friend “tended to be a binge drinker, but so were a lot of people. Rodney certainly wasn't an alcoholic. He didn't need alcohol to get through the day.” Thompson testified in November 2000 that he “was strongly leaning toward alcohol withdrawal and anxiety as contributing factors” when Marks came to the clinic on the day he died.

    A mechanical mystery

    Thompson's initial diagnosis never raised the possibility of methanol poisoning. In part, that's because he didn't perform a test that might have tipped him off. And the reason for that omission is another element of the case that troubles Marks's friends and relatives.

    In setting up the clinic after he arrived in November 1999, Thompson found that a machine called an Ektachem, which can measure a patient's blood chemistry, needed to be recalibrated every time it was turned back on. “It was an 8- to 10-hour process once it went down,” he testified, and Thompson said he “was too busy providing critical care to Rodney” once Marks arrived on 12 May to take the time needed to do it.

    A scientific haven.

    The AST/RO building (top) housed one of the many instruments operating in South Pole's dark sector, an area free of radio-wave and light interference, in this 2003 photo.

    CREDITS (TOP TO BOTTOM): THOMAS NIKOLA/CORNELL UNIVERSITY, DEPARTMENT OF ASTRONOMY; SGT. LEE HARSHMAN/U.S. AIR FORCE/NSF

    What Thompson didn't know is that the problem was due to the failure of a lithium battery that allows the machine to maintain its electronic memory after it's turned off. Had the Ektachem been kept running, it would have been available for immediate use after Marks showed up even though its battery was dead. (The battery didn't affect how the machine performed once it was calibrated.)

    Thompson also testified that the machine was difficult to use and unreliable and that the contractor was responsible for maintaining it. Not so, says Silva. Operating and maintaining the machine “is quite straightforward,” he told the coroner in 2006. He also explained that the manufacturer, Ortho Clinical Diagnostics, offers comprehensive online and free telephone technical support to deal with any problems. The coroner tried unsuccessfully to contact Thompson to invite him to respond to Silva's testimony on this and other points; the physician's current whereabouts are unknown.

    Whether the machine could have saved Marks is debatable. Sage, the forensic pathologist, testified that Marks's “chances of survival would have been considerably greater” with a timely diagnosis. Silva is less sanguine, having testified that “he very much doubted” that the standard treatment of infusing a 10% ethanol solution “could have succeeded given the magnitude of Rodney's intoxication.” The coroner's report sides with Sage, concluding that “the ektachem analyzer, if operational on the day, could well have led to an analysis of methanol poisoning, with the chances of his survival being considerably enhanced.” And McElrea blames Thompson for its unavailability. “It was his responsibility to keep it calibrated,” he writes. Leaving aside whether the treatment would have worked, Paul Marks and others argue that fingering methanol immediately would also have likely triggered a more thorough investigation at the scene.

    Silva, who no longer works for the Antarctic program, was one of two persons whom the coroner singled out for praise in his report. (The other is Harry Mahar, a former NSF health and safety officer now working at the U.S. State Department.) Both men provided “meaningful evidence” on several matters that NSF officials had declined to discuss, the report notes.

    Explaining why he agreed to testify, Silva told Science that “Rodney was one of our mates.… I did what I did because I think it's important to shed light on what happened when something goes amiss. I fancy that had NSF or RPS done some investigation and made it available to the authorities, the coroner probably would not have felt the need to become involved. [But] it appeared to us that there had been no substantive investigation.”

    Hurdles to clear

    Why did the coroner's investigation take so long? McElrea says he gave it his best shot. “The New Zealand police carried out as effective an investigation as was possible given the legal, diplomatic, and jurisdictional hurdles that arose over a number of years,” McElrea writes.

    McElrea's report notes that NSF never gave him a copy of the July 2000 Katz report that reviewed Thompson's medical records. (McElrea eventually obtained it, however, and attached it to his report.) In addition, McElrea describes how it took his chief investigator, Detective Senior Sergeant Grant Wormald, nearly 3 years to obtain information from Marks's co-workers at the pole after he sought the cooperation of NSF. NSF finally agreed to distribute a voluntary questionnaire to them but attached several strings. NSF officials vetted the content of the questionnaire “to assure ourselves that appropriate discretion has been exercised.” Once the questionnaire passed muster, it was mailed out by RPS, with a note saying that participation was voluntary. The police heard back from only 13 of the 49 co-workers.

    McElrea also says that unspecified “procedural reasons” foiled repeated efforts by Wormald to contact Thompson for a follow-up interview. In addition, the report includes Wormald's testimony in 2006 that “despite numerous requests, [he] was not entirely satisfied that all the information about investigations made by RPS or NSF has been disclosed to the New Zealand police or coroner.”

    McElrea's report doesn't address why those hurdles were thrown in his path, and he has declined further comment. “My role as a judicial officer is complete on the giving of findings and it is not appropriate that I discuss or comment on the case further,” he emailed Science in late November.

    However, it's possible that McElrea's own workload may have contributed to the slow pace of the investigation. Until last year, the coroner's job was a part-time position for McElrea, a partner in a large law firm in Christchurch. McElrea was also in the midst of writing a book when he claimed jurisdiction over the case after Marks's body was flown to Christchurch. Oddly enough, the book describes the heartbreaking saga of a group of men trapped in Antarctica while laying in supplies for Ernest Shackleton's aborted land crossing in 1914. Polar Castaways was published in 2004.

    NSF officials say the foundation has shared all appropriate materials with New Zealand authorities. Karl Erb, director of NSF's Office of Polar Programs, wrote in September 2005 to Neville Mathews, New Zealand's police representative at its embassy in Washington, D.C., that the Katz report dealt “only with medical aspects of the case in an effort to determine the cause of death and whether any action to protect other personnel at South Pole Station was required.” (The report, issued 5 months before the methanol poisoning became known, concluded that “no definitive diagnosis could be ascertained from the available data.”) The review, he added, “[is] therefore of little value to your inquiry.” Erb insists that NSF was attentive to requests from New Zealand for help, pointing to the coroner's statement in his report that he “acknowledge[s] the cooperation of NSF with the inquiry over several years.”

    Karl Erb, director of NSF's Office of Polar Programs.

    CREDIT: PHOTO BY SAM KITTNER FOR NSF

    As to whether additional evidence should have been gathered at the scene, Erb told Science this fall that the South Pole “is a working environment. It would not have been practical to cordon off the area. It's a very small place, and every part of it is in constant use.”

    Erb says he deeply regrets what happened: “It's a tragic, tragic event. And I have so much sympathy for his parents and family.” However, he doesn't think that NSF could have done anything differently. “If the coroner had had any reason to suspect foul play, he would have told us, and we would have contacted the Justice Department,” Erb says. “But we were assured 8 years ago that there was no evidence of foul play.”

    No resolution

    Eight years after the tragedy, residents at the South Pole live and work in a new $150 million station that was dedicated earlier this year. The medical quarters where Marks spent his last hours have been replaced by a modern medical facility, with telecommunications equipment that allows specialists in the United States to guide the station physician in carrying out diagnostic and therapeutic procedures. The living quarters, although hardly plush, are decidedly roomier. The science is booming: AST/RO, with its 1.7-m mirror, for instance, has been succeeded by the 10-m South Pole Telescope, the largest ever deployed at the pole, which began making millimeter-wavelength observations last year.

    As a decade once filled with promise for his talented son winds down, Paul Marks doesn't hold out much hope of getting to the bottom of what transpired. “After so long, it's probably impossible to ever know what happened and if he died by sinister means or by accident,” he says. “That's something we have to live with.”

    McElrea's report says that Marks's death points to a flaw in a system that governs the behavior of all nations that operate in Antarctica. In his sole recommendation to the New Zealand government, he says that the “partial outcomes” in this case “point to an urgent need to set comprehensive rules of investigation and accountability for deaths in Antarctica on a fair and open basis.”

    Paul Marks also believes that there are lessons to be learned from his son's death. “The overall management system, and the way NSF and Raytheon behaved that allowed this to happen, that's something that should be addressed,” he says. “People will find ways to do bad things. But things should never have reached the point at which somebody could drink a tainted liquid.”

  10. NSF Rethinks Its Digital Library

    1. Jeffrey Mervis

    A $175 million investment by the U.S. National Science Foundation has fostered collaboration and created vast amounts of material. But the digital world is changing.

    CREDITS: PHOTO MONTAGE: N. KEVITIYAGALA/SCIENCE (PHOTOS, TOP TO BOTTOM) JUPITER IMAGES; MAX CROPPER/ UTAH STATE UNIVERSITY

    The Web is a double-edged sword for teachers. Linda Lai has seen it deliver wonderful answers to the toughest questions posed by her third- and fourth-grade students at Edith Bowen Laboratory School in Logan, Utah. But separating the wheat from the vast amount of chaff on the Web takes time. Lai also worries that her students may be exposed to inappropriate material as they search for knowledge.

    Mimi Recker, a professor of instructional technology at Utah State University in Logan, which runs the kindergarten through grade-5 lab school, knows that the Web poses many challenges for teachers. That's why she asked the U.S. National Science Foundation (NSF) to fund development of a Web-based tool to help teachers find, manage, and manipulate high-quality educational materials for use in the classroom. The software, called Instructional Architect (IA), is one of hundreds of research projects funded by NSF's National Science, Mathematics, Engineering, and Technology Education Digital Library (NSDL) program.

    NSDL was launched in 2000 to help scientists and science educators tap into the rapidly expanding online world. Since then, the foundation has spent about $175 million “to provide organized access to high quality resources and tools that support innovations in teaching and learning at all levels.” In practice, that has meant three things: creating and maintaining a Web site (nsdl.org) with a vast assortment of peer-reviewed materials, including lesson plans, videos, lectures, examples, and teacher guides; providing support for more than a dozen disciplinary and sector-based portals, called Pathways, that offer suitable content to NSDL; and funding individual research projects, such as IA, that are aimed at helping researchers and educators make better use of online learning. [AAAS, which publishes Science, has received $5.3 million since 2000 to oversee the BiosciEdNet (BEN) Pathway.] The money also goes toward administration, outreach, and the support of a global community of users.

    Although NSDL has always focused on education, precollege classroom teachers are only one of many audiences. In fact, the thrust of NSDL in its early years was to help the research community make available something—high-quality educational materials—that by and large didn't exist.

    Because NSDL serves several different purposes, the payoff from NSF's investment, which has averaged almost $18 million a year (see graph, p. 57), has been hard to quantify. Its biggest advocates admit that relatively few educators and researchers have even heard of NSDL, much less visited the Web site or contributed material. It's proven to be no match for Google as a search engine for finding good sites. And there's no evidence to date that NSDL has improved student learning.

    Although NSF officials insist that NSDL has been a success, the agency is in the process of reducing its support for digital libraries. Last year, the initialism NSDL was redefined as the National Science Distributed Learning program and subsumed under a new, broader cyberlearning initiative for which digital libraries are only a small component. In September, NSF cut its support to the organizations that manage NSDL by more than half and described the new round of funding as a “ramp-down … toward self-sufficiency.” The consortia operating the various Pathway portals say they don't expect to get another bite of the apple. In 2007, NSF ended its funding of DLESE, a digital library for earth system education that is separate from NSDL but serves as an informal pathway for the earth sciences community (see sidebar).

    Those in the trenches remain committed to making digital libraries an integral part of science education. But they face serious obstacles. Lai, who began teaching in 1971 and who admits that she's “technology challenged” by the profusion of communications devices now available, says she's found IA very user-friendly. “The site is easy for my kids to access, and it's very safe,” she says. Lai prefers IA (ia.usu.edu) to the school's Web site, for which she must give an instructional aide a list of URLs and which allows no supplemental materials. “And I can do IA from home,” she adds.

    So how often do Lai and the other teachers at Edith Bowen, all of whom were trained on IA several years ago, use this powerful new online tool? Recker has found that only 20% of teachers in one study were still using IA 1 year later, and Lai and her colleagues are no exception.

    “I haven't used it this year,” Lai confesses. “My third graders are just learning how to use the laptops” that are kept in the school's media lab and carted around to classrooms as needed. “And I don't know how many [of the other teachers] still use it. There's so much else that we're expected to be doing, we don't really talk about IA.”

    Promise and reality

    NSDL was launched in the waning days of the Clinton Administration, at the height of the dot-com boom, when expectations about the Web's potential were sky-high. One leading online policy guru, Thomas Kalil, then of the National Economic Council (and now a member of the transition team for President-elect Barack Obama), wrote in 1996 about the need to “leverage cyberspace” for economic growth. NSDL was seen as a way to do that in the realm of education, and Kalil and other White House officials promised that it would improve student performance, heighten student interest in science, and make high-quality material readily available to parents, teachers, and students.

    CREDIT: CAROL MINTON MORRIS © 2008

    Lee Zia, NSDL's longtime program director, says that those statements, in hindsight, were “idealistic. The rhetoric at the time—greater connectivity would usher in this new era of educational achievement—was compelling.” He insists that NSF never tried to hype NSDL. “We'd rather be in a position of underpromising and overdelivering,” he explains.

    NSDL was not NSF's first exposure to digital libraries. In 1994, it joined with other federal agencies to fund a team at Stanford University trying “to develop the enabling technologies for a single, integrated and universal digital library.” A few years later, two graduate students on that project, Sergey Brin and Larry Page, used the search technology they had been developing to found Google.

    NSF officials weren't trying to replicate Google's commercial success when they launched NSDL. But as the Internet grew in importance, they began to ask how it could help teachers and students. There was already a lot of potentially useful content on the Web, some of it funded by other NSF programs and considered to be of high quality. But it was neither easy to find nor readily tailored for the classroom. As Zia puts it, “If you wanted to turn great piles of stuff into piles of great stuff, the first thing you needed was an infrastructure.”

    To provide that infrastructure, NSF funded “core integration” groups at the University Corporation for Atmospheric Research in Boulder, Colorado; Cornell University; and Columbia University to operate the main portal. They set out to develop technical standards, define the scope of the library, troubleshoot problems, and anticipate users' future needs. The service component was a new wrinkle on NSF's traditional approach of supporting cutting-edge research. “It was an unusual activity for NSF,” explains Zia. “NSDL was both a traditional program to fund research on an emerging technology [the Web] and an evolving entity that needed to be supported and sustained.”

    Once the administrative structure was in place, NSF put out a call for the community to compile collections that would link with NSDL. Over the years, NSF has funded 13 so-called Pathways serving both individual scientific disciplines and various user communities, including new ones this fall for the computing sciences and the quantitative social sciences.

    One major Pathway activity is to post material that other NSF programs had funded over the years, much of it created by those same organizations managing individual Pathways. “NSF has invested in a lot of educational resources, and this is a way to archive everything that the societies had done, including things that may have gone out of print,” says Yolanda George, deputy director of education and human resources at AAAS in Washington, D.C., which operates BEN (biosciednet.org). Each Pathway is also supposed to be the hub for an interactive community of users.

    Run by a consortium of 26 professional societies, BEN has put up 11,000 peer-reviewed resources—everything from scientific papers and reports to lectures and lab experiments—toward a goal of 25,000 by 2010. But to George, quality is more important than quantity. “Other people think that more is better,” she says. “I don't agree. I'd rather have a nice-sized catalog of peer-reviewed material that promotes active learning than a vast amount of stuff that hasn't been vetted.”

    Low quality isn't the only reason teachers might avoid Web-based material. Most online resources aren't aligned with the existing curricula of a local school district or with state standards that describe what should be taught. That makes it much less attractive for teachers already hard-pressed to cover what's required. Teachers must also be able to tailor the material to the needs of individual students.

    Many Pathway portals have tried to satisfy those requirements by attaching supplemental material to resources—journal articles, reports, and the like—not originally intended for classroom use. Cognitive scientist Tamara Sumner and her colleagues at the University of Colorado, Boulder, are tackling the issue head on. In 2007, she began working with secondary school earth science teachers in the Denver Public Schools to use NSDL resources to customize the district's curriculum in an interactive fashion. The curriculum was created by the American Geological Institute and is published by It's About Time, the Armonk, New York-based education division of Herff Jones that works mainly with NSF-funded curricula. Sumner, who has several NSDL-related research grants and is also co-principal investigator with Cornell on one of the two remaining infrastructure grants that NSDL supports, contributes both her expertise in teacher training and her knowledge of NSDL.

    A digital rise and fall.

    The budget of NSF's National Science Digital Library has fallen from its early days.

    CREDIT: (CHART) SOURCE: NSDL PROGRAM

    “We're creating Web 2.0 teacher guides for earth science courses,” she explains about a pilot study now under way to give Denver teachers an interactive platform to develop individualized lesson plans. It allows them to integrate information from the district's own IT system, which teachers now use to maintain student records and track their performance on ongoing formative assessments as well as year-end standardized tests, with material tailored to address the needs of students across a range of abilities, from gifted and talented to English language learners.

    Jim Short, then science coordinator for the Denver schools, says he was attracted by Sumner's previous work in organizing NSDL material to give teachers immediate access to exactly what they might need, when they need it. “I wasn't interested in more curriculum,” says Short. “But imagine how useful it would be for a teacher to link the concepts from an activity to an embedded assessment, then ask, ‘What key question would help me know if the student got the concept?’ and then not have to search for the answer because the appropriate resources are already tagged.”

    A former biology teacher who now directs the Gottesman Center for Science Teaching and Learning at the American Museum of Natural History in New York City, Short believes that the guides should help even the most experienced teachers. “I don't think teachers have the time or expertise to put it all together themselves,” he says.

    To teach a child.

    NSF has funded 13 Pathways projects to gather and disseminate material by discipline and sector. See below for URLs.

    CREDIT: N. KEVITIYAGALA/SCIENCE

    Ivy-covered indifference

    One group that presumably might have such expertise is rank-and-file academic scientists. Despite their vast disciplinary knowledge and a professional responsibility to educate the next generation, they have been slow to contribute to digital libraries, much less to NSDL. To find out why, NSF awarded a grant in 2005 to a group based at the University of Wisconsin, Madison.

    The resulting survey of faculty members at 300 U.S. universities (its title was “Lowering the Barriers to Faculty Participation in NSDL”) doesn't provide a direct answer. But it adds to the previously meager pool of information on how academics use online resources. The survey found, among other things, that they are prone to do their own searches and value speed as highly as they do quality, and that the material they download is most likely to be used in lectures. An overwhelming number say they are self-taught.

    “The biggest surprise to me is their reliance on Google,” says Alan Wolf, a co-author of the study, which he says was one of the first to ask such questions. Their methods are likely crude, he says—“they tend to spend a great deal of time looking for something that they assume exists”—and they often do it with a sense of urgency, “say, for a lecture the next day.”

    Wolf, a biologist who directs the New Media Center program at Wisconsin, doesn't know how elementary and secondary schoolteachers would answer a similar survey. But he guesses that “their use of Google as a search engine is likely to be similar.” He also speculates that the quality of what's on the Web is less of an obstacle to faculty members than to K-12 teachers. “They are experts and probably feel capable of judging the accuracy and suitability of the material” for their students, he says.

    An uncertain future

    In fact, Zia expects expert knowledge to play a bigger role in NSDL as its progress tracks the evolution of the Web. Begun as an information commons, the Web then became a mechanism to foster social networking and interactivity. But NSF is already planning for Web 3.0.

    “That's a return to editorialization,” explains Zia, on leave this year as a policy fellow in the office of U.S. Senator Jay Rockefeller (D-WV). “It's something to help the user get the content in the right context. It adds an interpretative component that's now missing.”

    Whether NSDL will be part of that next iteration of the Web remains an open question, however. The program was always an odd fit at NSF. The agency's deep roots in the academic research community don't necessarily help it nourish science teachers in thousands of local school districts. Those teachers need help solving problems that are often unfamiliar to academic researchers. “When we began 8 years ago,” says Utah State's Recker, “we assumed that we could build a resource bank of high-quality interactive material and that change would follow. That was naïve. Once we started to go into the classrooms, we realized the complexity of the environment.”

    NSDL must also operate within the rules that have made NSF a model agency for federal support of academic research: Applicants should address cutting-edge research questions, funding is for a finite period, and successful projects must figure out other ways to scale up or be sustained indefinitely. “A colleague of mine is fond of saying that there's only one thing that's certain about an NSF grant: It always ends,” says Sumner, who is quick to add that she's very grateful for NSF's continued support of her research at the University of Colorado, Boulder.

    Pulling it together.

    An NSF-funded project is giving these Denver high school teachers a chance to customize their district's curriculum with online resources.

    CREDIT: LYNNE DAVIS

    Still, Sumner and others are disappointed that NSDL isn't better known. “Could NSF have done a better job of marketing NSDL? Absolutely, but NSF doesn't fund marketing efforts,” she says. “They funded the research and the service components. There was never a product-development group with a marketing department, as would be the case for any commercial business.”

    Companies have another advantage over NSDL besides their marketing departments: Their products are obvious. Identifying exactly what NSDL has to offer is much trickier. Although a contractor is laying the groundwork for outside experts to do a thorough program evaluation, Zia admits that “it's going to be much harder to figure out the impact of NSDL on a particular student or school.” Given the privacy issues involved in trying to trace a user's Web behavior, he adds, “I don't even know if it could be done.”

  11. New Landscape for the Earth Sciences

    1. Jeffrey Mervis

    A digital library for the earth sciences community and by the community is now also of the community.

    Four years before the U.S. National Science Foundation (NSF) launched the National Science Digital Library (NSDL) (see main text), leaders in the earth sciences community began planning what would become DLESE (Digital Library for Earth System Education). In 2002, NSF took the plunge, making a 4-year award to the University Corporation for Atmospheric Research (UCAR) to operate the distributed, community-based library of materials for undergraduate education and research. But in 2005, after getting a scathing report from a visiting committee on how NSF's $21 million investment had been managed, NSF decided to phase out its support. After much soul-searching, a slimmed-down DLESE found a home in 2007 within UCAR's National Center for Atmospheric Research in Boulder, Colorado, where it is now part of NCAR's library.

    From the start, DLESE had twin goals: to unite earth and environmental scientists and to improve how those disciplines were taught, including the research component. In addition to vetting material for the library to assure high quality, researchers hoped DLESE would be their online watering hole, a place to foster collaborations and share their knowledge. But the NSF visiting committee found that DLESE's horizontal management structure actually inhibited good communications and that the whole added up to less than the sum of its various parts. “DLESE cannot be all things to all groups. … More is not always better,” its report concluded. The program's fiscal stewardship was another problem. “Even with the documentation provided [us], we still have almost no idea how $21 million have been spent over the past 5 years,” the committee noted.

    Christopher Maples, a paleontologist and former NSF program officer who chaired the panel, acknowledges that “we were pretty tough on them.” But the report also lauded DLESE for taking “a terrific, creative idea” and proving that it is “both promising and deserving.” Maples, who this fall became president of the Oregon Institute of Technology in Klamath Falls, says, “I haven't followed them since our report. But I'm glad to hear that they've landed on their feet.”

    Jill Karsten, program director for education and diversity within NSF's geosciences directorate, certainly agrees. She says that DLESE's “soft landing” is exactly what the foundation hopes will happen to worthy projects after their NSF funding runs out. (DLESE was supported largely by the geosciences directorate, which also funds NCAR, rather than the education directorate, which has funded NSDL.)

    But Mary Marlino, DLESE's longtime director and now head of the NCAR library, sees the transition a bit differently. In addition to laying off staff, she had to jettison several community-based activities and outreach efforts. “We're at maintenance level,” she says. “It's no longer DLESE on steroids. But we've survived, and we're moving forward.”

    DLESE initially targeted undergraduates and those who teach them, Karsten says, but it has gradually moved into elementary and secondary schools. In one such project, DLESE worked with middle and high school earth science teachers to prepare electronic “teaching boxes” for the classroom. “In the early days, we thought that access would be the ticket,” says Marlino. “But now we realize teachers need to understand the resources in context and how to use them.”

    Earth science teachers will have to do it on their own, however. The instructional units, on topics from weather to plate tectonics, are online and ready to use (teachingboxes.org/). But after the loss of NSF funding, Marlino says there are no plans to develop any more boxes.

  12. A Nobelist's Passionate Pursuit of Excellence

    1. Jeffrey Mervis

    Nobelist Harry Kroto argues that the best educational resources often come from “people who are passionate about what they are doing and want to share it”—even if those materials haven't first been put under the sort of disciplinary scrutiny typical of material included in the National Science Digital Library.

    In its role as a collection of collections, the National Science Digital Library (NSDL) offers more than 1.5 million resources to science educators. But even that impressive figure doesn't capture the work of everyone who is generating serious online content. One academic who operates outside NSDL's network is Harry Kroto, 1996 chemistry Nobelist for his co-discovery of fullerenes.

    Kroto, who retired from the University of Sussex in the U.K. and moved in 2004 to Florida State University in Tallahassee to focus on science education, believes that it's possible to produce high-quality materials without following NSDL's protocol of first putting everything under a disciplinary microscope. Instead, he argues that the best materials often come from “people who are passionate about what they are doing and want to share it. I'm committed to the ideals of the Dead Poets Society—you know, the charismatic teacher being the vehicle to excite students.” That principle, he adds, is why Wikipedia has become so much more popular than Encyclopedia Britannica.

    Toward that end, he's built a studio on campus that films presentations from fellow scientists. The materials are then posted on a site called GEOSET (Global Education Outreach for Science Engineering and Technology). The process is idiosyncratic—“if I hear about a good presentation on a particular topic, I ask the person to come by,” he explains—and runs on a tiny budget drawn mostly from university start-up funds. “I generally like to show people what I can do before I ask them for money,” Kroto says. “It was the same for my research on C60.”

    With 75 modules available, the bulk aimed at students from high school through graduate school, Kroto says it's time to start thinking of scaling up his digital library (geoset.group.shef.ac.uk). “I'd love to see this happening on hundreds of campuses,” he says, ticking off collaborators in the United Kingdom, Portugal, Croatia, and Japan. “All it needs is a room with the right equipment and someone who's really committed to the task.”

  13. A Vision in Search of Funding

    1. Jeffrey Mervis

    Last summer, the U.S. government created an awkwardly named entity to fund research on how the Internet can improve U.S. education. But so far, the National Center for Research in Advanced Information and Digital Technologies exists only on paper.

    Anecdotal evidence abounds of students getting excited about science through video games and other electronic educational resources. But the research to back up that premise is thin. Advocates of the new center persuaded Congress to take the first step as part of legislation reauthorizing U.S. policies toward higher education that became law last summer. But none of the $50 million that supporters are seeking in start-up funding has been appropriated.

    Its backers have been deliberately vague about what type of research the nonprofit center would fund so that experts can set the agenda—and to attract as much support as possible. Michelle Lucey-Roper of the Federation of American Scientists (FAS) in Washington, D.C., which has led the campaign, sees it funding “pre-competitive research on innovative learning tools serving all levels of society,” from preschoolers to retirees, and in settings as different as the battlefield and the factory floor. The center would be an odd creature in the federal research zoo: Although housed within the Department of Education, it would nevertheless have an independent policymaking board and be free to solicit money from other agencies and from the private sector.

    Supporters hope President-elect Barack Obama will ask Congress to fund the center in his 2010 budget submission. They note that his campaign promises to use technology to improve education and to double funding for education research point to his support for the concept. But FAS isn't waiting for the new Administration. It's already raised a bit of money to come up with an operating plan and management structure so that the center can hit the ground running if Congress ever funds it.

  14. Computers As Writing Instructors

    1. Greg Miller

    Software that helps students hone their writing skills is finding a niche in the classroom.

    CREDIT: (PHOTO MONTAGE): N. KEVITIYAGALA/SCIENCE (PHOTOS TOP TO BOTTOM) JUPITER IMAGES; JOHN CUMMING/GETTY IMAGES

    Can computers teach children to write better? Michael Jenkins, who teaches language arts at Estancia Middle School in central New Mexico, tells the story of Maria (a pseudonym), who so struggled to put her ideas on paper that she used to cry whenever he gave the class a writing assignment. That was before Jenkins began using writing-instruction software that provides feedback on students' essays and offers suggestions on how to improve them, all within seconds. By the end of the school year, Maria had more confidence in her writing abilities—and passed the writing portion of the state assessment test. “It's not a cure-all, but what a difference it's made in what the kids have shown they can do,” says Jenkins, who began using the software last year.

    He is not alone. Pearson Education, which makes the WriteToLearn software Jenkins uses, declined to provide an estimate of how many students use it, but its two main competitors, Vantage Learning and Educational Testing Service (ETS), each say that more than a million students use their respective software. Although teacher testimonials abound, few studies have directly investigated whether computer writing instruction works better than the traditional kind. Moreover, some skeptics question how well the programs capture the essence of good writing.

    Writing-instruction software has emerged in the past 5 to 10 years, but it traces its origins to the 1960s, when Ellis Page, a computer scientist who once taught high school English, realized that the time it takes for teachers to grade and return writing assignments limits the number of writing assignments they can dole out—and, therefore, the amount of writing practice students get. Hoping to speed things up, he identified several features that could be extracted from a writing sample by a computer—such as average word length and the number of prepositions—that correlated well with the scores human graders assigned. His automated grader gave scores that consistently agreed with human readers, but the punch-card data entry required by computers of the day limited its use.
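
    Page's method translates naturally into code. Below is a minimal sketch, in Python, of that style of feature-based scoring: a few surface features are extracted from each essay, and a linear model is fit to human-assigned scores. The feature set, the fitting routine, and the function names are illustrative assumptions, not a reconstruction of Page's actual system.

        # A sketch of feature-based essay scoring in the spirit of Page's
        # 1960s work: surface features that correlate with human grades
        # feed a linear model fit to human-scored essays. Everything here
        # is illustrative; it is not Page's actual feature set or model.
        PREPOSITIONS = {"of", "in", "to", "for", "with", "on", "at", "by", "from"}

        def features(essay):
            words = essay.lower().split()
            n = max(len(words), 1)
            avg_word_len = sum(len(w) for w in words) / n
            prep_rate = sum(w.strip(".,;:!?") in PREPOSITIONS for w in words) / n
            return [1.0, avg_word_len, prep_rate]  # leading 1.0 is the intercept

        def fit(essays, scores, lr=0.01, steps=2000):
            # Least-squares fit by stochastic gradient descent, no libraries.
            w = [0.0, 0.0, 0.0]
            data = [features(e) for e in essays]
            for _ in range(steps):
                for x, y in zip(data, scores):
                    err = sum(wi * xi for wi, xi in zip(w, x)) - y
                    w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            return w

        def grade(w, essay):
            # Predicted score for a new essay under the fitted weights.
            return sum(wi * xi for wi, xi in zip(w, features(essay)))

    Trained on a batch of human-graded essays, a model of this kind can score new writing in well under a second, which is precisely the turnaround problem Page set out to solve.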

    By the late 1990s, computing power had finally caught up, and several companies introduced software that expanded on Page's ideas and incorporated newer methods from machine learning, natural language processing, and statistics. (Page died in 2005.) These programs, for example, can analyze the vocabulary used in a passage to determine whether a writer strayed from the assigned topic and evaluate sentence variety—generally considered a mark of good writing.
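
    How such a vocabulary check might work is easy to sketch, although the vendors' actual models are proprietary and far more sophisticated. The version below is a guess at the general approach rather than any product's method: it compares word-frequency vectors of the essay and some on-topic reference text using cosine similarity, with a made-up threshold.

        # A guess at the shape of a topic-drift check: compare the essay's
        # word-frequency vector against reference text for the assigned
        # topic. Real products use richer, proprietary models.
        from collections import Counter
        import math

        def cosine(a, b):
            # Cosine similarity between two word-count vectors.
            dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
            norm_a = math.sqrt(sum(v * v for v in a.values()))
            norm_b = math.sqrt(sum(v * v for v in b.values()))
            return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

        def stayed_on_topic(essay, reference, threshold=0.2):
            # The threshold is an arbitrary placeholder, not a vendor value.
            sim = cosine(Counter(essay.lower().split()),
                         Counter(reference.lower().split()))
            return sim >= threshold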

    The first application of the software was to grade essays. Millions of essays are scored by computer each year as part of test-preparation software for standardized tests such as the Graduate Record Examination taken by prospective graduate students. The programs are also used in some high-stakes situations: the Graduate Management Admission Test, taken each year by about 200,000 students aspiring to MBA programs, uses software in scoring its analytical writing section. (A human reader also scores each essay, and a second person weighs in if the human and computer scores differ widely.)

    Today, Vantage, Pearson, and ETS dominate the field. “All three use slightly different mathematical models but a similar basic approach,” says Scott Elliot, a private consultant who previously worked at Vantage. The programs typically have to be “trained” on 100 or more human-graded essays before they can score new material.

    So how well do these programs evaluate writing? “The system has to agree with a teacher as often as two teachers would agree,” says Peter Foltz, vice president for research at Pearson. Studies have found that the programs consistently hit this mark for the type of everyday writing that's found in standardized tests and high school essay assignments, says Mark Shermis, an educational psychologist at the University of Florida, Gainesville: “They handle 95% of the writing that's out there, but I don't think they will ever do poetry … or identify the next great novelist.”
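
    Foltz's benchmark is simple to state in code. A minimal sketch, assuming essays are scored on a common numeric scale and using a hypothetical tolerance parameter to allow for near-miss agreement:

        def agreement(scores_a, scores_b, tolerance=0):
            # Fraction of essays on which two graders' scores differ by at
            # most `tolerance` points. The software passes muster if
            # agreement(computer, teacher_a) is at least as high as
            # agreement(teacher_a, teacher_b) on the same essays.
            pairs = list(zip(scores_a, scores_b))
            hits = sum(abs(a - b) <= tolerance for a, b in pairs)
            return hits / len(pairs) if pairs else 0.0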

    Over the past decade, several companies have adapted their essay-grading software to identify the strengths and weaknesses of an essay instead of simply issuing a grade. Rigorous studies on the effectiveness of this writing-instruction software are lacking, however. Those done by vendors typically examine the progress of a group of students using the software rather than comparing them with students who receive more traditional instruction, says Shermis. He has asked the Department of Education's Institute of Education Sciences to fund a study that would analyze the performance of two such groups on the Florida state assessment test. But Jenkins, who has taught for 14 years, says he is already convinced that the programs work.

    The software that Jenkins uses, sold by Pearson at a cost of $30 per student, generates colored bars that show students, among other things, how well their essay states and supports a main idea and whether their sentences flow smoothly and logically. Jenkins says his students can now write and refine an essay several times in a single class period; in the past, Jenkins needed a couple of days to return their first drafts. He welcomes the electronic assistance. “It builds their confidence because they keep trying till they get it,” Jenkins says. The software lets him spend more time with struggling students and also identifies topics that the entire class needs to review.

    Jenkins suspects that English language learners (ELL)—educationese for children who speak another language at home—may be among those who can benefit the most from using writing-instruction software. Last year, 92% of his ELL students passed the writing portion of the state assessment test, he says, compared with 31% of his ELL students before he started using the software. That percentage is also well above the statewide ELL rate of 58%.

    Richard Haswell, a professor emeritus of English at Texas A&M University, Corpus Christi, sees benefits and drawbacks to the writing software. Haswell says he's sympathetic to the plight of time-strapped teachers who want to incorporate more writing assignments into their classes, but he worries that “expediency is trumping validity.” One peril, says Haswell, who has studied both traditional and electronic measures of writing, is that the programs pick up quantifiable indicators of good writing—average sentence length, for instance—yet ignore qualities such as whether an essay is factually accurate, clear, or concise, or whether it includes an element of wit or cleverness. “Those are all qualities that can't be measured by computer,” he says.

  15. Glow Lights Up Scottish Classrooms

    1. Daniel Clery

    Schools across Scotland are flocking to link up with a national intranet that serves students, teachers, and parents.

    CREDITS: (PHOTO MONTAGE) N. KEVITIYAGALA/SCIENCE; (PHOTOS: LEFT TO RIGHT) JUPITER IMAGES; ISTOCK PHOTOS

    Jaye Richards is a biology teacher at Cathkin High School in the Glasgow suburb of Cambuslang. When she teaches the effects of pollution on rivers and seas, she asks her 14- and 15-year-old students to look far beyond Scotland to the River Don in Sheffield, England, the Yangtze River in China, and the Gulf of Mexico. She doesn't just turn them loose on the Internet, however. Instead, she taps into Glow, Scotland's national intranet for schools.

    By accessing video clips, podcasts, and newspaper articles—which Richards has identified in advance—her students can see how the theories they learned in class apply to the real world. “They can go to the Gulf of Mexico, find out the level of oxygen in the water, see the dead zones,” Richards says. The students collaborate on the tasks, communicate via instant messaging, and post their work on an online discussion board. Such a lesson is possible in any classroom with networked computers, but the teacher would need to be pretty tech savvy. Glow hopes to make such online learning routine. “For a standard classroom teacher, this is a very good system,” Richards says.

    Glow is essentially a Web site that aims to be a one-stop shop for Scotland's teachers, students, and their parents. A teacher who logs in can access e-mail within Glow. Discussion groups provide a forum for lively debates over curriculum, discipline, and other topics with colleagues in their school or across the country. Glow offers instant messaging as well as Web conferencing, using audio, video, or a shared whiteboard. And there are teaching materials, too. Students can use the system in class, in a moderated chatroom, or at home. “The pupils are talking about it,” says Richards. “I have to drag them out of the classroom at the end of the lesson.”

    The driving force behind Glow is Learning and Teaching Scotland (LTS), an agency funded by the Scottish government to develop the school curriculum. LTS began thinking about such a system in 2001 when the nascent Scottish parliament, created in 1999, was flexing its new policy muscles. Like many countries, Scotland had made large investments in providing computers for schools and access to the Internet. But their use varied widely among the country's 700,000 primary and secondary school students and their teachers, explains Laurie O'Donnell, director of learning and technology for the agency. LTS saw a national intranet as a way to help those among Scotland's 32 education authorities that were struggling, while encouraging all its teachers to collaborate.

    The Scottish government allocated $64 million to develop Glow and operate it for 3 years. LTS set about designing the system in collaboration with local authorities and the Oxford-based company Research Machines (RM). When it was ready, authorities were invited to come on board and provide Glow to their schools via their existing networks. LTS and RM provide training for local authority staff, who then train their own teachers. “It's driven by improving learning and teaching, not by forcing technology down teachers' throats,” O'Donnell says.

    Glow debuted in September 2007 and was immediately a hit: Half of Scotland's 32 education authorities are already using it, and all but two of the rest have signed up. (One decided that it couldn't afford the additional staff required to implement Glow, and a second has not signed up because of the bandwidth required.) “Glow looks to be groundbreaking,” says Seb Schmoller, head of Britain's Association for Learning Technology, a professional society for the field.

    It's even attracted notice far from Scotland's shores. Last summer, filmmaker George Lucas appeared before the U.S. Congress in his role as benefactor of the George Lucas Educational Foundation and held up Glow as a model of how schools could use information technology. “This kind of common platform makes perfect sense,” he told a panel of legislators looking to expand a program to wire up U.S. schools. O'Donnell says 20 countries have sent people to Scotland to see Glow in action.

    The biggest benefit of Glow, according to O'Donnell, is increased communication. Although Scottish schools already had access to the Internet and enough computers to tap into its vast potential, teachers were still developing their own teaching plans and materials and were generally unaware of similar efforts by other teachers, even those within the same school. “Glow creates an environment where teachers can collaborate and share,” says O'Donnell.

    Teachers have found they have much in common. Glow has helped a class in Dundee hold a videoconference with a museum curator who runs a recreated Victorian classroom. A school studying global citizenship linked up with a class in Malawi, and a violin teacher who routinely travels to remote schools in the region of Argyll and Bute is now able to hold a violin class simultaneously with six pupils in different places.

    A new Glow.

    Students in East Dunbartonshire, Scotland, work on Glow with Laurie O'Donnell, whose agency developed the project.

    CREDIT: LEARNING AND TEACHING SCOTLAND

    Glow has also made it easier for schools to meet the needs of their top students. Although Scotland offers extra final exams called advanced highers, small high schools often don't have suitable teachers. But Glow allows students to learn from a larger pool of teachers. “They can link to the best teachers in Scotland and the world,” O'Donnell says. LTS also expects parents to take advantage of Glow to keep up with their children's homework assignments, announcements, and school events.

    The system hasn't been around long enough for officials to assess its impact. But Richards devised her own study to see whether Glow was worth the time and effort. With funding from the General Teaching Council for Scotland, she studied Glow's impact on four biology classes at her school, all following the same syllabus. Three classes covered a module on water pollution in the traditional way; in the fourth, Richards used Glow in one of the three weekly lessons to reinforce her normal teaching.

    An end-of-year exam and other assessments found that her class performed 14% better than the other classes on the pollution module, and 32% better on that module than on other modules taught without Glow. Stephen Draper, who studies technology and learning at the University of Glasgow, will help Richards with follow-up studies on how a teacher's level of experience and other factors affect student learning with Glow, and how much time and effort the system requires.

    Draper says he was impressed by Richards's results but doesn't think technology is an educational cure-all. Instead, he believes that the Glow modules helped by connecting concepts to concrete examples. “It's not technology that causes learning but lesson design,” he says. The Glow lessons are self-paced and allow pupils to learn from one another. Meanwhile, Richards moves around the classroom dealing individually with problems, which Draper believes is “a much more sensible way to use your teachers.”

    Richards says Glow doesn't replace traditional teaching. But for students accustomed to using electronics outside the classroom, she says, Glow is an “enhancement. They are true digital natives. You have to tailor your teaching to the pupils in front of you, and Glow materials are suited to this.”

  16. Korea Tries to Level the Field

    1. Dennis Normile

    A cyber home learning system seeks to raise achievement and reduce the huge cost of private tutors, a practice not available to every Korean family.

    CREDITS: (PHOTO MONTAGE) N. KEVITIYAGALA/SCIENCE; (PHOTOS IN MONTAGE TOP TO BOTTOM) JUPITER IMAGES, GOGO IMAGES CORPORATION / ALAMY, CORBIS

    NAMYANGJU, SOUTH KOREA—When Kwangdong Middle School lets out, many of the students in this hardscrabble town northeast of Seoul head for pricey cram schools or private tutors for help with their studies. It's part of an educational rat race throughout primary and secondary education, culminating in entrance exams that determine which students get into Korea's elite universities.

    It's also a big business: The Hyundai Research Institute estimates that the average family spent 19% of its household income on extracurricular primary and secondary education in 2006. That investment totaled about $23 billion and represented 4% of the country's gross domestic product. In the same year, the Korean government spent only $21 billion on education. A recent government survey found that 77% of primary and secondary students were enrolled in various academic classes to supplement regular classroom instruction.

    On a recent afternoon, however, a handful of Kwangdong students have headed to the school's computer room instead of going to private tutors. Like their classmates, they are getting individualized help to improve their school performance. But they are getting it for free thanks to Korea's online Cyber Home Learning System (CHLS).

    “In class, the pace has to be the same for all,” says Sang Mi Kim, a math teacher who has tweaked the CHLS system to suit the needs of students in her cybergeometry class. “[In the computer room], students study at their own pace.” Initially skeptical of e-learning, Kim is now a vocal supporter. “It really is effective,” she says. One of her students, Yun Sik Lee, says the video lectures and animations help him keep up with his regular class. “I log on every day after school.”

    Cyber convert.

    Sang Mi Kim, a middle school math teacher, has switched from being a doubter to a booster of the system.

    CREDIT: DENNIS NORMILE/SCIENCE

    Korean education officials hope that enthusiasm is contagious. Their goal is to wean many students and their parents away from private educational services and to create a more equitable and affordable system. They hope the 3-year-old CHLS can help reduce the imbalance between urban areas, where private tutors are plentiful, and rural areas, where they are rare, and also relieve the middle class of what is now a crushing financial burden.

    Paul Resta, director of the Learning Technology Center at the University of Texas, Austin, calls CHLS “a very coherent and comprehensive approach to enhancing teaching and learning by integrating the school experience with resources available in the home. It really is a model [for] other countries.” Resta chaired a jury that in 2006 awarded CHLS the UNESCO King Hamad Bin Isa Al-Khalifa Prize for the Use of Information and Communication Technologies in Education. The next year, the cybersystem won the IMS Global Learning Consortium's Learning Impact Platinum Award for the use of technology to enhance learning. Toru Iiyoshi, director of the Knowledge Media Laboratory at the Carnegie Foundation for the Advancement of Teaching in Stanford, California, and a member of the IMS award panel, believes the CHLS model could be extended to adult and vocational education. “We need more individualized learning. The future possibilities are just huge,” he says.

    Time to learn

    CHLS relies on the nation's well-established broadband Internet infrastructure. Korea has one of the world's highest rates of home personal computer ownership. Every public classroom and practically every household has broadband Internet access, and all public school teachers have computers and must take information technology training courses. So in 2004, when the Ministry of Education turned to e-learning to provide an alternative to private tutoring, the infrastructure already existed.

    From the outset, the objective was to supplement classroom instruction with materials students could use on their own either at school or at home with the support of cyberteachers, cybertutors (usually parents), and online counselors. “In a classroom, you have a limited amount of time,” says Jae-Myung Yang, director of the cyberlearning team at the government-affiliated Korea Education & Research Information Service (KERIS), which has led development of CHLS. “But with e-learning, you can have an extended amount of time with more individualized content and more personal interaction with teachers.”

    CHLS provides materials for grades 4 through 10. It covers major subjects—Korean, social studies, mathematics, science, and English—and elective subjects such as composition, Chinese, and drawing. For each subject, there is supplementary, basic, and advanced content. Demonstrations and explanations feature animations as well as video clips involving real teachers. Students can replay the appropriate part of a lecture and request practice problems. Wrong answers are corrected immediately, and students receive feedback both on areas they have mastered and on those in which they need improvement.
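
    As a rough illustration of the loop just described, here is a minimal Python sketch that records answers, returns immediate feedback, and moves a student among the three content tiers. The scoring rule, thresholds, and names are assumptions invented for this example, not details of the actual KERIS system.

        # Illustrative sketch only; the three tiers come from the article, but
        # the running-score rule and thresholds are invented for this example.
        TIERS = ["supplementary", "basic", "advanced"]

        class Learner:
            def __init__(self):
                self.scores = {}  # subject -> running fraction of correct answers

            def record(self, subject, correct):
                old = self.scores.get(subject, 0.5)
                self.scores[subject] = 0.8 * old + 0.2 * (1.0 if correct else 0.0)
                # wrong answers are corrected immediately
                return "Correct!" if correct else "Not quite; here is the worked solution."

            def tier(self, subject):
                s = self.scores.get(subject, 0.5)
                return TIERS[0] if s < 0.4 else TIERS[2] if s > 0.8 else TIERS[1]

        student = Learner()
        for outcome in [True, True, True, True, True]:
            student.record("mathematics", outcome)
        print(student.tier("mathematics"))  # a run of correct answers -> "advanced"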

    With access to a pool of exercises and problems, cyberteachers can work one-on-one with students or give online lectures to an entire class in real time. Students can also interact among themselves. There are even materials to help students prepare for midterm and final examinations, including room for students' own notes. In some areas, CHLS also offers contemporary touches such as avatars and “cybermoney.”

    National and local governments invested at least $11 million to develop CHLS content, and the system went live in April 2005. Although it is a national initiative, CHLS permits the country's 16 metropolitan and provincial offices of education to modify content to suit local priorities, recruit cyberteachers, and regulate cyberclasses. Some local education offices have assigned students with special needs—such as disabled or gifted students—to limited-enrollment cyberclasses with specially trained teachers. Some teachers use CHLS to tailor homework for their regular students, whereas others have gathered cyberstudents from throughout their district. Kim, for example, created a cyberclass with a customized home page and problem sets selected for her students, adding her own explanations for some concepts.

    Pleasing pages.

    Some 80% of users reported benefits from Korea's Cyber Home Learning System, which allows teachers to customize materials for their online classes.

    CREDITS (TOP TO BOTTOM): (SCREEN GRAB) KERIS; (CHART) KOREA EDUCATION & RESEARCH INFORMATION SERVICE (KERIS)

    Asked why she volunteered, Kim flashes her glacier-melting smile and jokes that “there are predictions schools will disappear, and I felt my job was at risk.” In fact, she felt CHLS gave her the chance to stay at the forefront of trends in education. Watching a boy who was last in his class become one of her best students after a semester with CHLS erased her initial skepticism. “It's really effective because students can always log on and access material,” she says.

    Hak Song Lee, Kwangdong Middle School principal, says CHLS also gives shy students an opportunity to get help without embarrassing themselves. “Korean students hesitate to show [their classmates] they don't understand something. With the CHLS, they feel free to ask the teacher questions,” he says.

    Help for teachers

    By 2008, its fourth year of operation, CHLS had enrolled about 65% of the country's nearly 5 million students in grades 4 through 10, although officials acknowledge that some students enroll but don't really rely on the system. The system serves half of the students from small rural schools and low-income families—priority targets for Korean officials.

    Despite international praise and high marks from users, Korean education officials consider CHLS a work in progress. “We are trying to improve its effectiveness,” says Hyun-Jin Cha, a researcher at KERIS. Officials are tweaking content and trying to attract more elite students from affluent urban families. “What we hear from many students,” she says, “is that they have no time to work with the CHLS between their regular classes and their private tutoring.”

    A national student survey by KERIS found that more than 80% of respondents reported better grades, increased confidence in a subject, or other benefits (see illustration). It also seems to be having the desired effect on parents' pocketbooks: Nearly one in six CHLS users said they had dropped or were considering dropping private tutors, an estimated savings of $1.8 billion.

    The biggest issue, according to some observers, is attracting and retaining the interest of talented teachers. Cyberteachers get at most nominal additional compensation, and a study by Hakjin Bae and colleagues at the Korea National University of Education in Chungbuk found that they received no systematic training on the system. And the teacher is key, says Kim. CHLS “is just a tool; how successful it is depends on the teacher,” she says.

    KERIS is looking at ways to keep teachers motivated and provides guidelines on training them. But those policies ultimately rest in the hands of the local education offices. If better training and compensation for cyberteachers can be figured into the equation, more Korean students are likely to turn to CHLS when school lets out rather than make a beeline for private tutors.

  17. A Personal Tutor for Algebra

    1. Yudhijit Bhattacharjee

    Commercial software created in the lab anticipates wrong answers and reinforces needed skills for first-year algebra students.

    CREDITS (TOP TO BOTTOM): (PHOTO MONTAGE) N. KEVITIYAGALA/SCIENCE; (PHOTOS) JUPITER IMAGES.

    In 1983, psychologist John Anderson wrote a computer program to test his theory of how people solve math problems. The program was designed to compare the steps subjects take in solving a problem with those predicted by a theoretical model. It also gave students—typically in high school—feedback to help them get back on track when they ran into trouble. Anderson, a professor at Carnegie Mellon University (CMU) in Pittsburgh, Pennsylvania, expected the studies to uncover flaws in his theory. After being pleasantly surprised at “how well it seemed to work,” Anderson had another revelation: “Here was something that could be used to improve student learning.”

    His program became the foundation for Cognitive Tutor (CT), a software product now used to teach algebra in more than 2600 middle and high schools across the United States. Developed at CMU and commercialized by a spinoff company called Carnegie Learning, CT has been shown to be effective in a small randomized field study—the kind of trial considered the gold standard for evaluating curricular interventions.

    Yet there's no guarantee that using CT will improve learning: In at least two other studies, the product led to no significant rise in student test scores. And even though it has been around for 15 years, only now is the software going through a large-scale trial. Experts say the cloudy picture shows how difficult it is to rigorously evaluate educational technologies.

    Nonetheless, researchers say CT has benefited from two factors rarely found together in instructional products: deep roots in academia and a university willing to bring it to market. Most commercial tutors are developed by textbook publishers, with little or no research behind them, says Alan Lesgold, an education professor at the University of Pittsburgh. And interventions developed in university departments rarely get the financial backing to advance beyond a small scale, he says.

    CT's creators say that ongoing improvements to the software would not have been possible without the steady revenue from sales or its enduring relationship to the university. CT “embeds the principles of good, cognitively based instruction into a well-designed software environment,” says Mitchell Nathan, an education professor at the University of Wisconsin, Madison, who has no connection to CT. “The tutor has continually grown.”

    Here's a hint

    What distinguishes CT from the majority of tutors on the market is the precision with which it assesses a student's skills and provides instruction tailored to that assessment at each step in the problem-solving process. Many tutors assume that a student has learned a concept if he or she gets the right answer, says Lesgold. By contrast, he says, CT attempts to “figure out what's going on inside the kid's head.”

    CT presents students with a series of progressively more difficult problems. It provides hints tailored to the particular difficulty that an individual student may be facing. These hints are the tutor's fundamental instructional tool, says CMU cognitive psychologist Kenneth Koedinger, one of the creators of CT and a co-founder of Carnegie Learning.

    Take the equation 3(x - 2) = 21. If a student makes an error trying to simplify the left side of the expression and writes “3x minus 2,” the tutor might provide a hint saying “you forgot to multiply correctly.” The system would simultaneously note on a “skillometer” that the student is weak at “distribution,” that is, applying the multiplication function to all the quantities within the brackets. If the student repeats the error, the system might provide a second, more obvious hint, such as “multiply 3 by 2.” The system would provide a different set of hints for students pursuing another approach, say, dividing both sides by 3. If the student repeatedly requests a hint, the system offers many examples of the same type of problem to build mastery of the skill.
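
    To make that exchange concrete, here is a minimal Python sketch of buggy-rule matching and hint escalation for this one equation. It is an illustrative reconstruction, not Carnegie Learning's actual code: the rule table, skill name, and hint wording are assumptions invented for the example.

        # Illustrative sketch: match a student's step against known "buggy
        # rules" and escalate from a vague hint to an explicit one.
        BUGGY_RULES = {
            # wrote "3x - 2": forgot to distribute the 3 over the -2
            "3x - 2 = 21": ("distribution", ["You forgot to multiply correctly.",
                                             "Multiply 3 by 2."]),
        }
        CORRECT_STEPS = {"3x - 6 = 21", "x - 2 = 7"}  # distribute, or divide by 3

        def respond(step, attempt):
            if step in CORRECT_STEPS:
                return "Correct. Keep going."
            if step in BUGGY_RULES:
                skill, hints = BUGGY_RULES[step]
                # here the skillometer would mark 'distribution' as weak
                return hints[min(attempt, len(hints) - 1)]
            return "Check your last step."

        print(respond("3x - 2 = 21", 0))  # first error: the vaguer hint
        print(respond("3x - 2 = 21", 1))  # repeated error: the explicit hint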

    The software's ability to present relevant hints is based on a technique developed by Anderson and his colleagues in the 1980s called model tracing. The tutor essentially maps the student's error to one of many possible violations of mathematical rules. The tutor's ability to track the student's expertise in acquiring the needed skills to solve a task is based on a related technique called knowledge tracing.
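
    Knowledge tracing was later formalized by Anderson's group, in work with Albert Corbett, as a Bayesian update over a handful of skill parameters (rates of guessing, slipping, and learning). The Python sketch below shows that standard update rule; the parameter values are illustrative assumptions, not the tutor's real settings.

        # Standard Bayesian knowledge-tracing update; parameter values here
        # are illustrative assumptions, not Cognitive Tutor's settings.
        def update_mastery(p_know, correct, p_guess=0.2, p_slip=0.1, p_learn=0.15):
            """Revise the probability that a skill is mastered after one step."""
            if correct:
                evidence = p_know * (1 - p_slip)  # knew it and didn't slip
                posterior = evidence / (evidence + (1 - p_know) * p_guess)
            else:
                evidence = p_know * p_slip  # knew it but slipped
                posterior = evidence / (evidence + (1 - p_know) * (1 - p_guess))
            # each practice opportunity is also a chance to learn the skill
            return posterior + (1 - posterior) * p_learn

        p = 0.3  # prior belief that the student has the "distribution" skill
        for outcome in [False, False, True, True]:
            p = update_mastery(p, outcome)
            print(round(p, 2))  # roughly what a skillometer bar would show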

    Top tutors.

    From left, Ken Koedinger, John Anderson, and Steve Ritter created Cognitive Tutor and have watched it grow.

    CREDIT: TODD KELLY

    Anderson was working on these techniques when Koedinger joined his lab in 1986 as a graduate student. For his thesis, Koedinger developed software to teach geometry proofs and tried it out at a local high school. That's where Koedinger discovered the importance of professional development. “The teacher I was working most closely with got the best results,” he says. He also decided to integrate the software with the rest of the curriculum. Armed with these insights and a grant from the U.S. National Science Foundation (NSF), Koedinger and fellow CMU researchers decided to create a complete algebra course, including a textbook and software. Students typically use CT twice a week and receive regular instruction the remaining 3 days.

    After funding from NSF ended in 1993, the researchers kept CT alive with grants from local charities. “These organizations were very concerned about strengthening Pittsburgh's technology base,” Anderson says. Campus efforts to develop educational technology usually don't receive that kind of support, says Lesgold. He says CT benefited from the desire of local foundations such as the Heinz Foundation to see “social return on their investment.”

    Eager to commercialize the software, CMU founded Carnegie Learning Inc. in 1998, combining the university's own capital with outside investments. Anderson and Koedinger were given a stake in the company, and Steve Ritter, another member of the founding team who earned his Ph.D. under Anderson in 1992, was hired as its chief scientist.

    The university first considered licensing the software to textbook publishers, Ritter says, but that route carried no requirement that they improve it. He says a second option, making it open source, would have helped fix bugs but would not have provided the resources needed for further development.

    Ritter says independent studies have shown that the software helps students learn algebra. In a 2001 study that met the U.S. Department of Education's standards of evidence, 224 middle school students in Moore, Oklahoma, scored higher on average on a standardized test after using CT than did 220 fellow students taught by the same teacher using a different textbook. Nearly one-third of the students who used CT in this study were rated proficient in an Educational Testing Service assessment compared with 17% taught using traditional materials.

    Other studies have failed to show any significant rise in student achievement, however. The first part of a $10 million government study on the effectiveness of reading and mathematics software reported in April 2007 that CT was one of five algebra products that, in a collective evaluation, produced no improvement in student performance compared with a control group. Another randomized study, in the Maui School District in Hawaii, which looked only at CT, came to the same conclusion.

    Ritter says the divergent results illustrate the difficulty of evaluating the effectiveness of educational technology: The outcome depends heavily on the way such products are tested and used. Both studies measured the impact of CT after only 1 year, which he says is not long enough for teachers to become familiar with it. “People think technology somehow minimizes the effects of the teacher, but if the teacher isn't trained and committed to making the curriculum work, then it can be hard to show results.” A $6 million randomized field trial launched by the RAND Corp. in 2007 will follow students in 130 schools across six states for 5 years in an attempt to measure long-term impacts.

    Instructional synergy

    Steve Spence, a math teacher at Old Mill High School in Millersville, Maryland, likes that CT keeps track of an individual student's minute-by-minute performance. “I can go into a student's account and know how long they have been working on a section, how many times they clicked on the hint button, how many errors they made,” says Spence. “I can then try to focus on whatever skills the student is missing.”

    CT has something to offer both stronger and weaker students. Tenth grader Aneisha Vester used CT last year to leapfrog ahead of most of her classmates and complete all 40 chapters in the tutor. (The rest of the class finished fewer than half that number.) “It gave me something to do,” says Vester, who adds that proceeding on her own from one chapter to the next gave her a great sense of accomplishment. At the same time, Old Mill math teacher Janet Liimatta says the flexibility of being able to log in from anywhere, at any time, gave another student the extra time needed to acquire the necessary skills.

    Henry Kepner, president of the National Council of Teachers of Mathematics in Reston, Virginia, says there's always a danger that instructional software will become less effective once the novelty wears off. “After a while, it becomes routine and they learn to play the system,” he says. Spence agrees that teachers must intervene if a disengaged student tries to breeze through entire sections by repeatedly clicking the hint button. “The instructor and the tutor [must] work together,” he says.
