News this Week

Science  13 Nov 1998:
Vol. 282, Issue 5392, pp. 1234

    Democrats Match GOP in Sending a Physicist to Washington

    1. David Malakoff

    Democrats now have a physicist of their own in the U.S. Congress. In an upset victory last Tuesday, Rush Holt, a one-time student of solar winds, edged out first-term Republican Mike Pappas to capture New Jersey's 12th district congressional seat. Holt joins Michigan Republican Vernon Ehlers, who easily won a fourth term, as the first two Ph.D. physicists elected to the House of Representatives. “If we can find a room with a chalkboard, we'll form a bipartisan physics caucus,” an ebullient but exhausted Holt joked following his 50% to 47% win.

    Holt's win was just one unexpected result in a surprising election that shattered GOP hopes of strengthening its narrow majorities. The results—a net loss of five House seats and no change in the Senate—led to the resignation of House Speaker Newt Gingrich (R-GA) and prompted challenges to other party leaders. But although the probable new speaker, Representative Bob Livingston (R-LA), is unlikely to match Gingrich's vocal enthusiasm for science, most of Congress's other ardent R&D supporters retained their seats by comfortable margins. Among the returnees are the architects of the National Institutes of Health's recent $2 billion increase, Representative John Porter (R-IL) and Senator Arlen Specter (R-PA), along with Representatives James Sensenbrenner (R-WI) and George Brown (D-CA), chair and ranking member of the House Science Committee.

    On the other side of the country, voters in Washington state passed a referendum that dismantles the state's affirmative action programs. The Washington initiative—which asked voters to ban discrimination but did not specifically mention affirmative action—will prohibit governments and universities from using race and gender as criteria for selecting employees, contractors, and students. It makes Washington the second state, after California, to roll back preference programs. But state officials say there will be years of litigation before the measure's full impact is known.

    Meanwhile, researchers and science lobbyists hope that Holt's victory will provide a sorely needed boost to the scientific expertise of the nation's legislative branch, which contains only a handful of scientists and engineers. Holt, 50, has been assistant director of the Princeton Plasma Physics Laboratory in Plainsboro, New Jersey, since 1989 and spent 2 years in the late 1980s as a State Department arms control adviser. Scientists around the country contributed to his campaign, and Ehlers sees his election as a sign that scientists—often chided by politicians for their political naiveté and lack of activism—are becoming more interested in making a mark at the polls. “The science community has become much more politically energized in the last 2 years,” believes Ehlers, who says he is “delighted to have another physicist aboard.” He is disappointed, however, by the loss of conservative ally Pappas.

    Although Pappas was considered among the House's most vulnerable incumbents after a narrow victory in 1996, many pundits had discounted Holt's chances. In particular, they doubted he could raise enough money to campaign effectively in the sprawling suburban district, which stretches across central New Jersey and includes the science-rich Princeton University campus. But Holt, who earned a doctorate at New York University in 1981 and later taught at Swarthmore College in Pennsylvania, raised nearly $1 million in cash and in-kind donations, more than any other Democratic challenger in New Jersey.

    Most of Holt's nearly 4500 cash gifts—which totaled about $880,000—came from traditional Democratic constituencies, including teachers, union members, and abortion rights activists. But several hundred scientists coughed up significant time and money too, according to campaign officials and Federal Election Commission records. Among them were more than a dozen Nobel Prize winners, including biologist David Baltimore, president of the California Institute of Technology in Pasadena. Supporters also included scores of Holt's former colleagues from Princeton, the American Physical Society (APS, which awarded him a Congressional Science Fellowship in 1984), and the Department of Energy (DOE), which funds the plasma lab. DOE science chief Martha Krebs, for instance, gave the campaign $250 in June. “We made a major effort to reach out to the scientific community,” says physicist Sherrie Preische, a former Princeton graduate student who took time off from her job at the APS to serve as the campaign's treasurer.

    One researcher who answered the call was Princeton physicist Norton Bretz, who donated a total of $2000, the maximum allowed for both the primary and general election. “Scientists usually focus on their own work, but [Holt's candidacy] jarred us a bit and got us interested,” he said. Another $2000 donor was APS Editor-in-Chief Martin Blume, a physicist at Brookhaven National Laboratory in Upton, New York, who calls himself “a bipartisan supporter of electing more scientists to Congress.” Blume, who in the past has given to candidates from both major parties, says that “the degree of scientific illiteracy in Congress is a serious problem, but few of us have any idea of how to get elected. Rush and Ehlers do.”

    Holt's political savvy is no surprise to those who know his family history. His father was, at 29, the youngest person ever elected to the U.S. Senate, representing West Virginia for one term in the 1930s. And his mother served as West Virginia's secretary of state. “I was raised to think that politics was an honorable calling,” Holt told The New York Times earlier this year.

    Holt didn't dwell on his scientific credentials during the campaign, and as a freshman from the minority party he is unlikely to have a significant impact on national science policy. But he did claim “technical expertise that is so rare in Congress and political expertise that is so rare in science.” He promises to be a “strong advocate for R&D” but declines to say whether he is seeking a seat with direct responsibility for science policies or spending. He also knows that his narrow win marks him as vulnerable in the 2000 campaign. “There are a dozen Republicans already thinking about running against me,” he says. “I say, bring 'em on.”


    Understanding of Ears, Bristles Jumps a Notch

    1. Gretchen Vogel

    The arrangement of cells in the inner ear, which allows a music lover to sense harmonies, is itself as complex as a Bach fugue. It is orchestrated during development, when precursor cells in the inner ear organize into a mosaic of sensory patches made of hair cells, which sense vibrations; neurons, which send messages to the brain; and supporting cells. In the December issue of Development, researchers describe part of the system of molecules that creates this intricate pattern. In an example of evolution's tendency to reuse basic mechanisms, it turns out to be the same system that guides the development of a much simpler sense organ in the fruit fly.

    Developmental biologist Julian Lewis of the Imperial Cancer Research Fund in London and his colleagues demonstrate that proteins called Delta and Notch, already known to help pattern the vibration-sensing bristles on a fly, control the development of the hair cell mosaic in the inner ear of zebrafish and chicks. Other evidence from mice, which developmental biologist Matthew Kelley of Georgetown University has presented at several meetings, suggests that the same proteins are at work in mammalian ears as well.

    Lewis's work “begins to give us an idea of the molecular pathways that govern the development of the mosaic,” Kelley says. And it may eventually have clinical applications: In mammals, including humans, damaged hair cells are lost for good, but birds can regenerate them. Understanding how the cells develop in the first place “begins to give us insight into the pathways that prevent regeneration,” Kelley says.

    Delta and Notch are powerful determinants of cell fate in both the fly and in vertebrates. For example, as a group of vertebrate pre-neuronal cells matures, one cell gets slightly ahead of its cousins. Delta, which is lodged in the cell's membrane, interacts with the Notch receptor on neighboring cells, preventing them from becoming neurons. In the developing fly bristle—a miniature sense organ on the fly's head and body, which is made of a neuron and accessory cells—Delta-Notch signaling seems to work in a similar fashion to determine which precursor cells become neurons, bristle shafts, and supporting cells.
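
    The lateral-inhibition logic can be caricatured in a few lines of code. This is purely an illustration, not the papers' model: the fate rule, the ring of 12 precursors, and the random Delta levels are all assumptions made here. The idea is only that a cell whose Delta level gets slightly ahead of its neighbours' keeps the primary fate, while its Notch-activated neighbours are pushed toward the supporting fate.

```python
# A toy caricature of Delta-Notch lateral inhibition (my illustration; the
# fate rule, ring of 12 precursors, and random Delta levels are assumptions,
# not the published model). A cell whose Delta level exceeds both
# neighbours' escapes inhibition and takes the primary (hair-cell) fate;
# its neighbours, with Notch activated, adopt the supporting fate.
import random

def cell_fates(delta, signaling_works=True):
    """Assign fates on a ring of precursor cells."""
    n = len(delta)
    if not signaling_works:
        # No Delta-Notch signaling: nothing inhibits anything, so every
        # precursor defaults to the primary fate.
        return ["hair"] * n
    fates = []
    for i in range(n):
        left, right = delta[(i - 1) % n], delta[(i + 1) % n]
        winner = delta[i] > left and delta[i] > right
        fates.append("hair" if winner else "support")
    return fates

random.seed(0)
precursors = [random.random() for _ in range(12)]  # some cells get "ahead"

normal = cell_fates(precursors)
blocked = cell_fates(precursors, signaling_works=False)
print("with signaling   :", normal)
print("without signaling:", blocked)
```

    Because only strict local Delta maxima escape inhibition in this toy rule, no two adjacent cells can both take the primary fate, giving the salt-and-pepper mosaic of sensory and supporting cells; with signaling switched off, every cell takes the primary fate.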

    Because of the similarities between bristle and hair cell structure and function, scientists had suspected that the inner ear might also use Delta-Notch signaling in development. To find out, Lewis, Julie Adam, Anna Myat, and their colleagues looked for expression of the Delta gene in the ears of chick embryos. A few hours before the first neurons appear, the scientists found Delta expressed in scattered cells. Several days later, Delta was expressed again at just the site where the mature hair cells appeared a few hours later. By the time the hair cells were recognizable, Delta had nearly disappeared, but the two bouts of Delta expression are “strong evidence” that Delta guides neuron and hair cell development, says developmental neurobiologist Jeffrey Corwin at the University of Virginia, Charlottesville.

    In a second paper in the same issue, Lewis, Catherine Haddon, and their colleagues suggest that a similar process also controls ear development in the zebrafish. They examined the embryonic ears of a mutant zebrafish called mind bomb, so named for its excess of neurons. No one has yet pinpointed the gene responsible for the fish's bumper crop of neurons, but researchers think that the mutation somehow blocks Delta-Notch signaling. Sensory patches in the inner ears of mind bomb fish become “wall-to-wall hair cells,” says Lewis, with no visible supporting cells—every cell becomes a hair cell.

    Given that fly bristles and ear hair cells do basically the same thing and use the same developmental genes, Lewis and his colleagues propose that they may have evolved from a common ancestral sensory structure. But Corwin isn't so sure. Delta-Notch signaling is so common that “it may be like a subroutine in computer programming that evolution uses over and over,” he says. So it's possible, says Corwin, that evolution reused the system in unrelated organs.

    However the system evolved, understanding it may be useful. After chick inner ear cells are damaged, they express Delta as they regenerate, according to developmental neurobiologists Edwin Rubel and Jennifer Stone at the University of Washington, Seattle. These findings, under review at Development, may bring scientists a little closer to the day when “we will be able to restore hair cells in the human ear,” says Rubel—an achievement that may allow today's headbangers to enjoy Bach in their old age.


    Microsoft Picks Beijing for New R&D Lab

    1. Justin Wang*
    *Justin Wang writes for China Features in Beijing.

    BEIJING—Microsoft Corp. is making a major research investment to help it capture and retain a large chunk of China's fast-growing computer business. Last week, the software giant announced that it will open a research center in Beijing, pledging to spend $80 million over 6 years to make computers more user-friendly for speakers of Chinese through work on voice recognition, information retrieval, and machine translation systems. The long-term, fundamental research will be “China oriented,” says Kai-Fu Lee, managing director of the lab, which will be known as Microsoft Research (MSR) China.

    The Beijing facility is the second overseas venture established by Microsoft Research, a $200 million division of the Redmond, Washington-based company. In June 1997, it opened its first such facility in Cambridge, England (Science, 20 June 1997, p. 1783). The two sites, together with two U.S. labs, employ some 300 scientists working in such areas as speech recognition, databases, user interfaces, and three-dimensional (3D) graphics.

    The Beijing center will occupy 3000 square meters on the sixth floor of an office building in Zhongguancun, an area already home to dozens of research institutes of the Chinese Academy of Sciences and not far from Beijing and Tsinghua universities. The northwestern suburb is also known as a Chinese version of “Silicon Valley” for its concentration of computer and electronics companies. The new lab is expected in 3 years to grow from a half-dozen employees to around 100 researchers. “The strength of China's economy and the quality of its academic system” were major factors in choosing the location of the new lab, says Jack Breese, assistant director for Microsoft Research.

    Lee, former president of Cosmo Software, the multimedia software business unit of Silicon Graphics Inc., joined Microsoft in July to head the new Beijing facility. He has been a pioneer in the areas of speech recognition, artificial intelligence, 3D graphics, and multimedia. Born in Taiwan and raised in the United States, Lee, 37, received his Ph.D. in computer science at Carnegie Mellon University in Pittsburgh, where he helped to develop a speech-recognition system that doesn't have to be trained to respond to a particular voice as well as a program for Othello, a board and computer game, that defeated the human world champion.

    MSR China hopes to expand Microsoft's ties with China's computer science community by sponsoring international seminars, supporting journals, funding academic studies, setting up links to universities around the world, and hiring Chinese students after they have finished their studies abroad. “Our research will be focused on forward-looking studies” that should appeal to the best students, Lee says.

    China is the fastest-growing information market in the world, says Lee, with 30% annual growth rates for PCs and estimated sales of 8.3 million in 2000. Earlier this year the company opened the Greater China Regional Support Center in Shanghai, and its efforts to upgrade Windows CE, an operating system for handheld computers, mark the first time Microsoft has formed teams in both China and Redmond to tailor products for the country. The company also has an agreement with the Ministry of Information and Industry to promote its products on the Internet in China, and it recently signed agreements with six large Chinese software companies to bundle Windows NT and SQL, for database management, into their business applications.


    Canada Opens Program to Community Groups

    1. Wayne Kondro*
    *Wayne Kondro writes from Ottawa.

    OTTAWA—Canada's research granting councils traditionally channel funds into the academic community. But last week one of them took the revolutionary step of making public-interest groups eligible for grants from a new research program to attack such societal problems as poverty, illiteracy, and poor health.

    The new activity, called Community University Research Alliances (CURA), is being funded by the Social Sciences and Humanities Research Council (SSHRC), one of the country's three major funding councils. Over the next 2 years, SSHRC will make 3-year, $160,000 awards to 16 centers that will plant the seeds for what the council hopes will grow into a national network of university and community researchers working on projects that serve local needs in the social sciences. Council president Marc Renaud says that making community groups eligible is the only way to make them feel like “true partners” in the venture, which is modeled after a long-running program in the Netherlands. “If partners means that it's always the university that calls the shots, that controls the budget, and that gives resources free of charge to these projects, then maybe we're not talking about real partners,” Renaud says.

    The very notion that community groups can apply for research grants intrigues Montreal social activist Alice Herscovitch. As director of Project Genesis, an advocacy group for the poor and elderly, Herscovitch has often served in an advisory capacity on collaborative projects with universities. But the experience has been less than satisfying. “Being part of an advisory committee means you have absolutely no input or, if you really push, very little,” she says. “The process, the analysis, and even the final results—we don't have access to them.”

    Keven O'Brien, head of Canadian Feed the Children, envisions a CURA project to examine malnutrition across the nation and the quality of food provided by breakfast clubs. Peter Dawe, head of the Newfoundland and Labrador division of the Canadian Cancer Society, can imagine a half-dozen issues that a CURA center could explore, including attitudes toward alternative health therapies, the efficacy of public education campaigns, and the degree of self-examination for breast cancer. “There's a desperate need for systematic, evidence-based problem-solving,” he says.

    Not everyone welcomes the new rules, however. Some university administrators worry that SSHRC's decision will further dilute the already limited resources available for academic research and may not easily fit into traditional university reward systems. “There's currently not very good means to evaluate [outreach scholarship] and certainly little recognition of it in either tenure or promotion,” says University of Manitoba dean of arts Ray Currie. “If the community is the origin of these [centers] rather than the university, I think it will be harder to convince the universities that this is credible scholarship” and harder to convince campus researchers to participate.

    Renaud says the potential benefits from building better “bridges” between the campus and the community are worth the risk of upsetting a few academic apple carts. “If it's a pilot, you might as well make it a real pilot,” he says. “Either we're making a mistake or we're making history. You tell me.”


    NSF to Send College Students Into Schools

    1. Jeffrey Mervis

    Kristin Guthrie took as few science courses as possible as a student at Iowa's Luther College. And as a second-grade teacher at Mary Lin Elementary School in Atlanta, she admits that science lessons used to get squeezed out to make room for other subjects in a crowded curriculum. But that was before her “science partner” arrived. Now her kids can't wait for their 90-minute, 2-day-a-week science lesson. And neither can she. “That time is sacred—it's when we do science,” says Guthrie, a 10-year veteran at the kindergarten-to-grade-5 school, which serves a predominantly poor, minority population near downtown Atlanta. “The partner brings so much to the classroom, it's just wonderful.”

    Guthrie's science partners—science-literate students from one of seven Atlanta-area colleges and universities—come to Mary Lin as part of a program funded by the National Science Foundation (NSF) that sends undergraduates to help as teachers' aides in 69 elementary schools. The Atlanta program, called Elementary Science Education Partners (ESEP), is one of several such initiatives around the country, often involving scientific societies as well as universities. Now NSF has decided to go national with a variation of the concept. The new program would send graduate and upper-level undergraduate students into elementary and secondary school classrooms by offering them something on a par with the prestigious NSF graduate research fellowships.

    NSF officials have yet to work out details of the program, which would begin with a special grants competition this winter. But it's a favorite of Rita Colwell, NSF's new director, who sees university students as an untapped source of talent and enthusiasm for teaching (Science, 25 September, p. 1944). “This is definitely going to happen,” Luther Williams, head of NSF's education directorate, told a group of public school and university educators late last month. “And I'm asking you to think about how you can get involved.” As things stand, the 3-year awards would go to universities that link up with local school districts on programs bringing together college students and teachers. And the $12 million pot—carved out of NSF's current 1999 budget—is seen as a down payment on a larger effort in 2000 and beyond.

    The program's primary goals are to improve the quality of K-12 science and math instruction and promote teaching as a career for science-savvy students. “We need good teachers just as badly as we need new scientists,” says Robert DeHaan, a cell biologist at Emory University School of Medicine in Atlanta and the creator of ESEP, which gives college credit to undergraduates who serve as science partners.

    A successful program, say educators, also could alter the mindset of university faculty, who typically place research at the apex of a scientific career. “It has the promise of changing the culture,” says Jan Tuomi, director of a project at the National Academy of Sciences called Resources for Involving Scientists in Education. “It says this is an award for an alternative career path, not just something for scientists who have ‘failed’ in the lab.”

    Educators already involved in such projects say that they welcome NSF's heightened interest. “Any help that you can provide a teacher is useful, and anybody you can turn on to [K-12] teaching would be great,” says Hewlett-Packard's Jan Hustler, director of the Bay Area (California) Schools Excellence in Education project, which has grants from HP and NSF to involve scientists from industry and academia in training teachers at 83 area elementary schools.

    Hustler and others emphasize that, to succeed, such programs have to provide the teachers' aides with proper training and good mentors and place them in schools already in the midst of reforming how children learn science. In addition, success requires a commitment from the university, especially at the graduate level, where students are expected to immerse themselves in a research project. “A lot of faculty may be wary of having their students mucking around in a classroom,” says Ramon Lopez, an astronomer at the University of Maryland, College Park, and head of education and outreach at the American Physical Society, whose Teacher Scientist Alliance Institute trains researchers to work in their local schools. “But I think that those who are open-minded will welcome it.”

    Despite the hurdles facing NSF's new program, Guthrie says ESEP has shown that the right environment can make for a great partnership. “I'm still in charge,” she says about the students she has worked with. “I take care of discipline and make sure that we follow the curriculum. But they often bring in other material that I don't have access to. And even when they use terminology that might be a little too sophisticated, the kids think it's really neat to be taught by a scientist.”


    Reaction to Stem Cells: A Tale of the Ticker

    1. David Malakoff

    For the Geron Corp. of Menlo Park, California, it was déjà vu all over again last week, as the company announced research results that sent investors into a tizzy. In January, the biotech company's stock almost doubled in price when Science published a paper by a Geron-funded researcher reporting a way to extend the life-span of human cells (Science, 16 January, p. 349). But the price soon sank as investors realized that the scientifically interesting findings wouldn't soon lead to profitable products. Last Thursday, it happened again.

    Geron, which has been operating in the red to the tune of $40 million since 1994 and is still years away from profitability, saw its stock price jump, then slump, when company-supported researchers reported in Science and the Proceedings of the National Academy of Sciences that they have cultured “immortal” stem cells in the laboratory (Science, 6 November, pp. 1014 and 1145). The cells could potentially be used to repair damaged organs and tissues.

    Geron's stock prices over the week spanning publication of the Science paper tell a revealing tale:

    • On 30 October, Science sent more than 1200 reporters an “embargoed” notice of the stem cell paper a week in advance of publication. Reporters were not allowed to report the findings publicly until the following Thursday at 4 p.m.

    • As reporters began to prepare their stories, rumors about the findings began to circulate and Geron's stock edged upward. On 2 November at 10:58 a.m. Eastern Standard Time (EST), an anonymous Geron investor posted this message on a Yahoo! stock buyers' bulletin board: “[Geron] stock is going through the roof this morning. … What's the news? Does anyone know what is going on?”

    • Sometime after noon EST on 5 November, the French news agency Agence France-Presse put out a story about the findings at least 3 hours early. By 1 p.m., Geron's stock price jumped by $2. By 4 p.m., when the embargo officially ended, the price was around $10. “It looks like everybody on the street knows what the news is except us,” e-mailed an exasperated investor, who hadn't yet seen the story.

    • On Friday, after the findings made the headlines, Geron's stock soared to $24.50, roughly four times its price a week earlier. “The reaction of the stock price is absurd,” Jim McCamant, editor of Medical Technology Stock Letter in Berkeley, California, warned the Associated Press.

    • After a weekend of reflection, investor interest slumped and Geron prices slid to $13.75. The discovery, opines one analyst, “is a lot more significant scientifically than commercially.”
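
    The arithmetic behind the timeline can be checked in a few lines, using only the prices quoted above. Note that the “week earlier” baseline is not quoted directly in the story; it is implied by the “roughly four times” figure, so the value computed here is an inference, not a reported number.

```python
# Back-of-envelope check of the swing in Geron's stock, using only the
# prices quoted in the story. The week-earlier baseline is implied by the
# "roughly four times" figure, not reported directly.
price_thu_close = 10.00   # ~4 p.m. Thursday, as the embargo lifted
price_friday = 24.50      # Friday, after the headlines
price_monday = 13.75      # Monday, "after a weekend of reflection"

implied_week_earlier = price_friday / 4          # "roughly four times"
weekend_slide_pct = 100 * (1 - price_monday / price_friday)

print(f"implied price a week earlier: about ${implied_week_earlier:.2f}")
print(f"Friday-to-Monday slide: about {weekend_slide_pct:.0f}%")
```

    The implied baseline comes out near $6, and the weekend retreat erased roughly 44% of Friday's peak while still leaving the stock above its pre-embargo level.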


    Binaries Answer Riddle of Brown Dwarf Origins

    1. Alexander Hellemans*
    *Alexander Hellemans is a writer in Naples, Italy.

    A brown dwarf is a poor excuse for a star. Too small for gravity to ignite its nuclear furnace, a brown dwarf is heated primarily by the energy released as it contracts, although some deuterium fusion may take place. As a result, its surface at its hottest barely reaches 2000 degrees—our sun's surface is 5800 degrees—and it glows more like a hot coal than a star, emitting most of its energy in the infrared. But in one respect a brown dwarf is undeniably a star, as is shown by two recent studies of brown dwarfs, one of them in this issue of Science. Unlike planets, which take shape from debris that remains after the formation of their parent sun, a brown dwarf can form on its own out of interstellar gas and dust, like any respectable star.

    Astronomers discovered the first brown dwarfs only 3 years ago, and only a dozen or so of these dim, barely visible objects have been documented, some near other stars and some on their own. The question of how they originate has not been settled, however, because an isolated brown dwarf could have formed like a planet, around a star, and then been flung into interstellar space by gravitational interactions. But astronomers have now imaged brown dwarfs with companions—in one case another brown dwarf, in the other a star too young to have planets—that show the brown dwarfs must have formed from interstellar clouds.

    Because brown dwarfs are at their brightest when they are young and still contracting, young star clusters such as the Pleiades—just 100 million years old—have become the main hunting grounds for these objects. A team led by Eduardo Martin of the University of California, Berkeley, used the NICMOS infrared camera on the Hubble Space Telescope to scour the Pleiades for a kind of object that might settle the origins question: a binary brown dwarf. If brown dwarfs form from collapsing gas clouds, as stars do, then you should see binaries made up of two brown dwarfs, just as ordinary stars are often found in binary pairs. And as the team will report in Astrophysical Journal Letters, they found such a pair: a system, dubbed PL 18, made up of two brown dwarfs with masses just above and just below 50 Jupiter masses, orbiting each other every 1000 years at a distance 42 times the Earth-sun distance.
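
    As a quick consistency check (my arithmetic, not the team's), the quoted orbital parameters can be run through Kepler's third law, which in units of AU, years, and solar masses reads M_total = a³/P². The check assumes the quoted 42-AU separation stands in for the true semi-major axis, which it only approximates, since an imaged separation is a projection onto the sky.

```python
# Kepler's third law in AU / years / solar masses: M_total = a**3 / P**2.
# Assumes the quoted 42-AU separation approximates the semi-major axis
# (it is really a projected separation, so this is only a rough check).
SOLAR_MASS_IN_JUPITERS = 1048  # 1 solar mass is ~1048 Jupiter masses

separation_au = 42.0   # quoted separation, in Earth-sun distances
period_yr = 1000.0     # quoted orbital period

total_mass_suns = separation_au**3 / period_yr**2
total_mass_jupiters = total_mass_suns * SOLAR_MASS_IN_JUPITERS
print(f"implied total mass: ~{total_mass_jupiters:.0f} Jupiter masses")
```

    The implied total is near 80 Jupiter masses, the same ballpark as the quoted pair of roughly 50-Jupiter-mass dwarfs; the modest shortfall is consistent with the projected separation underestimating the true orbit.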

    Although an isolated brown dwarf could have formed like an oversized planet and then been flung into space, says team member Wolfgang Brandner of NASA's Jet Propulsion Laboratory in Pasadena, California, a binary would not survive that treatment. “It follows that brown dwarfs must form like stars, from clouds that collapse and then fragment.”

    Support for this view of brown dwarf formation comes from another find by Rafael Rebolo of the Astrophysical Institute of the Canaries in Tenerife, Spain, and his colleagues. The group—two members of which also worked with Martin on the brown dwarf binary—reports on page 1309 that it has found a tiny brown dwarf circling a star using ground-based telescopes in the Canaries. “We have imaged what is the lowest mass substellar object so far found orbiting a star,” says Rebolo.

    To find this system, “we chose stars that are much younger than the sun,” says team member Maria Rosa Zapatero Osorio. They identified 52 of these young stars—the ones most likely to have detectable brown dwarf companions—by looking for lithium, an element that is formed during the big bang but is gradually burned up in the nuclear furnace of stars and is only visible early in a star's lifetime. Close to one of the stars, they saw a second dim, lithium-containing body—a brown dwarf, which they call G 196-3B.

    They estimate that the two components are separated by roughly 100 times the Earth-sun distance, and that the brown dwarf's mass may be as low as 15 Jupiter masses. The team believes that the binary system is about 100 million years old—too young, says Ralph Neuhäuser of the Max Planck Institute for Extraterrestrial Physics in Garching, Germany, for the dwarf to have formed from an accretion disk, like a planet. He adds, “Because the brown dwarf is so far away from the star, fragmentation of a molecular cloud is the most likely scenario for its formation.”

    The evidence that brown dwarfs are just like other stars, at least by birth, is likely to get stronger, although the astronomers do not rule out accretion-disk formation in some cases. Zapatero Osorio says that since the Astrophysical Journal Letters paper was submitted, the team has identified another 20 brown dwarf candidates in the Pleiades and imaged a second binary brown dwarf system. Brandner believes that the number of brown dwarf discoveries will increase fast. The Pleiades, for example, contain about 600 known stars, and “it looks like there are as many brown dwarfs as there are stars,” says Brandner.


    Cold Spring Harbor to Offer Own Degrees

    1. Eliot Marshall

    While some scientific leaders are trying to persuade universities to reduce the number of Ph.D.s they award in the life sciences, Cold Spring Harbor Laboratory (CSH) in New York announced last week that it is going in the opposite direction. It's creating a new grad school that will offer Ph.D. candidates a shorter, more student-oriented graduate experience.

    The biology lab won state accreditation in September to open a School of Biological Sciences, and it will enroll five candidates in its inaugural class next fall, says CSH director Bruce Stillman. The school, which will be headed by CSH assistant director Winship Herr, eventually will expand to 10 students per year. Stillman says CSH hopes to raise an endowment of about $1 million per student, enough to free them from some of the pressures of obtaining outside funding.

    CSH has been educating people “since its inception” as a research field station a century ago, says Stillman. It currently has more than 50 Ph.D. candidates from the State University of New York, Stony Brook, on campus and thousands more taking short courses tailored to everyone from world-class scientists to high school novices. Stillman says the degree-granting program will give the lab a more direct role in shaping the education of students on campus. The new program, he says, will help CSH “change the way education is done.”

    The lab's main innovation, according to Stillman, will be to shorten the time it takes to get a Ph.D.—from the standard 7 years to 4.5 years. CSH hopes to get most of the basic instruction done in the first year, although students would take short courses (some lasting no more than 1 week) throughout their time at the lab. In another change from the standard approach, Stillman says, Ph.D. candidates will have two mentors—one to guide them through the details of preparing a thesis and the other to look out for their intellectual development. The goal, he says, is to ensure “that the research will benefit [the students] and not necessarily reflect what a [National Institutes of Health] study section thinks is important” as an experiment.

    The supply of new Ph.D.s is a contentious issue, especially in the life sciences, where recent graduates complain of a paucity of academic positions. In September, a committee of the National Research Council (NRC) of the National Academy of Sciences headed by Princeton molecular biologist Shirley Tilghman argued that there should be “no further expansion” of existing Ph.D. programs and “no development of new programs.” This week, a report from the Association of American Universities (AAU)* warns against “the unnecessary proliferation of Ph.D. programs” and says that new programs should meet “a regional or national need.”

    Tilghman, who is also a trustee of CSH, calls the new graduate program “an interesting demonstration project” that could be an exception to the situation described in the NRC report. If the lab really can train a Ph.D. in less time than “even the best programs in the country” now require, says Tilghman, it will have done something “important.” Furthermore, because students at CSH “will not be tied to the research programs of their mentors,” Tilghman says, they will gain the degree of independence her panel wanted to encourage.

    AAU Executive Vice President John Vaughn, one of the authors of the AAU's new report, says a few universities are now trying to limit the number of years Ph.D. candidates serve as teaching assistants. But it may be difficult to implement such changes at some big state universities, he says, without adding to the pressures students already face.


    Pasteur Recruit Resigns in Battle Over New Unit

    1. Michael Balter

    Paris—Joseph McCormick has worked successfully in a lot of challenging environments. He's chased the Ebola and Lassa fever viruses in the jungles of Zaire and Sierra Leone, and he's battled hepatitis C and cholera epidemics in Pakistan. But none of those experiences prepared him for life at the Pasteur Institute in Paris. On 1 November McCormick resigned as chief of the epidemiology and biostatistics unit he had been hired to create less than a year ago after a tenure committee decided to postpone a decision on granting him permanent status.

    McCormick's rapid rise and fall appears to be part of a broader debate over the role of epidemiology at the Pasteur, which traditionally has put a heavy emphasis on basic research. It also reflects the political infighting at the Pasteur in the run-up to elections next year for a new director-general to succeed Maxime Schwartz, who cannot run again and whose leadership style is seen by some as authoritarian. McCormick, formerly of the U.S. Centers for Disease Control and Prevention (CDC), was recruited after a stint in Pakistan by Schwartz and medical director Philippe Sansonetti, who is regarded as a candidate for the top job (Science, 13 March, p. 1629).

    The trigger for his resignation was a June vote by the tenure committee, an elected body of Pasteur scientists, to delay for 1 year a decision on his status. Unfortunately for McCormick, the scientist who presented his case, microbiologist Patrick Grimont, is often at odds with Sansonetti over the institute's future. McCormick says that Grimont showed little knowledge of the project during a brief visit before the panel's vote, and sources familiar with the committee's deliberations say that his very negative presentation contributed to the 14-0 vote. “McCormick paid the price [for his ties to Schwartz and Sansonetti],” says one Pasteur scientist who asked not to be identified. “He was parachuted in and he didn't know” the score.

    Grimont disputes that interpretation of the committee's action. “I did my job [as presenter of McCormick's dossier] as honestly as I could,” he says. “There were no political or personal influences” that affected his presentation. Even so, Schwartz says that the panel's vote was “completely unexpected.” And Sansonetti views McCormick's departure as a “setback” to his wish to strengthen the institute's public health portfolio.

    Administering a setback may in fact be what the committee had in mind. Although most were reluctant to discuss their decision publicly, several told Science that they had reservations about McCormick's plans to create a CDC-style epidemiology program at Pasteur. They also felt that McCormick's extensive experience as a field epidemiologist did not fit the academic environment at Pasteur, which prizes more “fundamental” research in epidemiology and biostatistics. Other sources at Pasteur said that McCormick's desire to create a high-powered pathogen laboratory in his unit was seen as too similar to existing research at the institute. “I am surprised that he wanted to have a lab, with a lot of equipment,” says Grimont. “What was lacking at Pasteur was an epidemiology unit, not a lab looking for bugs.”

    Despite the dispute over McCormick's plans, committee members say their action was not meant to be a push out the door. “Joe McCormick was not fired from Pasteur or anything like that,” says genome researcher Antoine Danchin, a member of both the tenure committee and the scientific council, a separate body that approved McCormick's hiring and the creation of his unit. But McCormick says the decision forced his resignation, as scientists without tenure are not allowed to recruit other researchers for their units. Molecular biologist Moshe Yaniv, also a member of both bodies, agrees that McCormick's inability to recruit people for his unit “would certainly have complicated his life.”

    Although McCormick's epidemiology unit will be disbanded, Schwartz and Sansonetti still hope to create a new program that will carry out much of the same work. McCormick plans to maintain some ties with Pasteur, serving as a consultant on a variety of projects, including vaccine evaluation. He also hopes to be a liaison to a high-security pathogen lab in Lyons that his virus-hunting wife, Susan Fisher-Hoch, is helping to construct. His background as a field epidemiologist does not seem to bother the Lyons-based vaccine firm Pasteur Mérieux Connaught, which has just hired him to put together an epidemiology program.

    Looking back at his short stint at the institute, McCormick says that “if I did something wrong at Pasteur, I don't know what it was.” But he confesses that, when it came to politics, “I might have been a little naïve.”


    Genome Links Typhus Bug to Mitochondrion

    1. Elizabeth Pennisi

    As recently as the First and Second World Wars, the louse-borne disease typhus swept through armies, ghettos, and prison camps, killing millions of people. Instability and the breakdown of public health measures in Eastern Europe have experts worrying about possible new epidemics of the disease, which is marked by high fever and delirium. But a close look at Rickettsia prowazekii, the bacterium that causes the disease, reveals that, in spite of its fearsome reputation, it is a degenerate organism, riddled with nonfunctional genes and gradually losing genes it once needed to function.

    In this week's issue of Nature, molecular microbiologist Charles Kurland of the University of Uppsala in Sweden and his colleagues describe the complete sequence of the 1.1-million-base pair genome of the pathogen. By helping identify genes that make R. prowazekii so deadly, the information may help researchers design better typhus vaccines. The sequence, now one of 18 microbial genomes finished, is also a window to the distant past.

    Researchers think that the mitochondria, the small structures that serve as the cell's powerhouses, were derived from bacteria that took up permanent residence in an early ancestor of modern cells. Comparisons of ribosomal RNA genes had indicated that Rickettsia, one of the so-called alpha proteobacteria, could be the closest living relative of the mitochondria's predecessor. Now, Kurland says, the new genome sequence “is as confirmatory as you can imagine” about the link between mitochondria and Rickettsia. It also illustrates the gene loss that must have marked the mitochondrion's own transition to dependence on the host cell.

    Kurland and his colleagues, who began the sequencing project 6 years ago, found 834 genes in the Rickettsia genome, a half-dozen of which code for proteins similar to those that make other bacteria virulent. Three of these look like the genes that produce toxic polysaccharides in Staphylococcus aureus, which causes boils. The information should help researchers interested in developing new vaccines for typhus find the right proteins to include in their inoculations, Kurland says.

    The effort also seems to have paid off in helping pin down the origins of the mitochondria. With the sequence in hand, Kurland, Uppsala's Siv Andersson, and their colleagues compared the Rickettsia genes to the DNA still present in modern mitochondria. “We see very strong similarities,” says Andersson, particularly in genes involved in energy production. The group also found that many of the pathogen's genes closely resemble genes that code for proteins used by yeast mitochondria—but are found in the nucleus of yeast cells.

    This suggests, Kurland says, that somehow “there was an early evolutionary event where there was an off-loading of these genes” from the early mitochondrion to the nucleus. As the ancestral host nucleus took on these genes, the mitochondria would have become more dependent on the host cell, until eventually they could no longer survive except within the cell.

    R. prowazekii hasn't taken up permanent residence in cells yet, but it is an obligate intracellular parasite, meaning that it can multiply only in living cells. As a result, Kurland thought its genome might show signs that genes once needed by the organism when it could reproduce independently are being lost. The new sequence indicates he and his colleagues were on the mark. The genome “is a wonderful study in the way genomes evolve to become degenerate,” says evolutionary biologist Carl Woese of the University of Illinois, Urbana.

    When Andersson surveyed the microbe's existing genes, she found that several key ones, including those needed to make the building blocks of DNA, are missing. Thus, the organism has to depend on the cells it infects to produce these materials. What's more, Andersson adds, the sequence indicates that the “genome is still in the process of getting smaller.” She points to an enzyme, called S-adenosylmethionine synthetase, which makes a compound that adds methyl groups to a variety of cellular building blocks. The gene that makes the enzyme, metK, has been found in all the microbial genomes sequenced so far except for that of Chlamydia, another organism that can thrive only inside other cells. In R. prowazekii, however, this gene is altered and is no longer expressed.

    Several other recognizable “genes” no longer work because of mutations in their sequences. In fact, Kurland and his colleagues found that functional genes take up only 75% of R. prowazekii's DNA, whereas all of the other bacterial genomes have little extraneous DNA. “With several dead genes and a lot of noncoding DNA, its percentage [of junk DNA] is higher than [that of] any other microbial genome,” Andersson says. Woese expects to see more examples of such gene inactivation in the genomes of parasitic microbes. This observation is “probably going to be a trendsetter for the field.”


    Training Viruses to Attack Cancers

    1. Elizabeth Pennisi

    A growing assortment of viruses that replicate in and kill cancer cells, but not normal tissue, may be new weapons in the war on cancer

    Viruses have spawned more than their share of misery over human history. Cancer has an equally grim record. Now several companies and about a dozen research teams are working on recruiting one kind of scourge against another. They are developing viruses that are either naturally harmless to normal tissue or have been genetically altered to make them so and are turning them loose on cancer cells. The hope is that the viruses will do what doctors, with their scalpels, chemotherapy, and radiation beams, all too often cannot: eradicate cancer without damaging normal tissue.

    Here the viruses play a different role than they do in experimental therapies that rely on viruses to ferry therapeutic genes into cancer cells, where the new genes might correct the genetic errors underlying the uncontrolled cell growth. These gene-carrying viruses have been disarmed so that they can't multiply and spread. But the secret of the new anticancer viruses is precisely their ability to replicate and spread, killing the cells—albeit only within the cancer. “It's like a chain reaction that spreads until it gets to the tumor boundary,” explains Jeffrey Ostrove, a virologist with NeuroVir Inc. in Vancouver, British Columbia.

    So far, researchers have come up with a half-dozen of these tumor-killing viruses, the latest of which is a reovirus described on page 1332 by Patrick Lee, a virologist at the University of Calgary in Alberta, and his colleagues. The reovirus—a type of virus that doesn't cause problems in humans—is not yet in clinical trials, but two other viruses are, and early results from one indicate that it can shrink tumors, particularly when used in conjunction with other therapies. “A lot of people are very excited because of the lack of side effects and the hope of specificity to cancer cells,” says Steven Linke, a molecular biologist at the National Cancer Institute in Bethesda, Maryland.

    Much more work will be needed to see whether this preliminary promise will hold up. But viruses that simply kill cancer cells—so-called oncolytic viruses—are just the first wave of this new type of cancer therapy. Also in the works are oncolytic viruses that not only kill cancer cells but also carry genes that make the cells more susceptible to radiation or chemotherapy, thereby delivering a double blow to the tumor. As such, they represent “a whole new avenue of potential treatments,” says Robert Martuza, a neurosurgeon at Georgetown University in Washington, D.C.

    Ras appeal

    Lee and his colleagues didn't start out looking for new ways to treat cancer; they were using human reoviruses to study how viruses in general work. Reoviruses grow fast, are easy to work with, and are apparently harmless to people, although they can kill newborn mice. Researchers already knew that, in order to infect cells, reoviruses have to latch onto molecules of sialic acid on the cell surface. But Lee and his colleagues realized that something more is also needed, because the virus replicates only in a subset of the cells that carry sialic acid.

    In 1993, Lee's team found a clue to what that might be by showing that the virus does better in cells that also have surface receptors for a molecular signal called epidermal growth factor (EGF). Three years later, Lee's group showed that it wasn't the EGF receptor per se that is important but the signaling pathway the receptor activates when it binds the growth factor. And this year, they reported that what enables the virus to thrive is one particular component of that pathway, the protein made by the ras gene. To replicate, the virus needs the Ras protein because it blocks the activity of another protein in the cell, called PKR, that would otherwise prevent the synthesis of viral proteins.

    That discovery pointed Lee toward the current work, because ras is one of the oncogenes that can, when inappropriately activated, spark cancer cell growth. Lee realized that the virus would probably replicate readily in tumors that have an overactive ras gene, which include some colon, pancreatic, and lung cancers. To test this idea, he and his colleagues transplanted cells from a human brain cancer called glioblastoma into immune-deficient mice that would not reject the cells. The cancer cells had high levels of Ras protein because of mutations in proteins that control the oncogene's activity. After the cancer had taken hold, the researchers shot the tumor full of virus. “The virus was extremely potent,” says Lee. The tumors shrank or disappeared in 65% to 80% of the mice tested.

    Lee notes that unpublished results from his team show that the virus also kills cultured cells derived from breast, prostate, and pancreatic cancers, but none of the noncancerous cell lines tested, which have low ras activity. “They've shown pretty convincingly that this virus has specificity for ras-mutated cells,” says Frank McCormick, a molecular biologist at the University of California, San Francisco.

    McCormick, who earlier led the team that developed another oncolytic virus at ONYX Pharmaceuticals, a biotech firm in Richmond, California, worries about the safety of reovirus in humans, however. He notes that some of the treated mice in Lee's experiments died, presumably as a result of the infection. But other experts dismiss the concern. Reoviruses “are completely nonpathogenic in people,” says virologist Wolfgang Joklik of Duke University in Durham, North Carolina. “I wouldn't worry about them.” Lee is working with the Canadian government to get permission to try the reovirus in people with breast or head and neck cancers that haven't responded to conventional therapies.

    In the clinic

    Other oncolytic viruses are already in the clinic. The farthest along is the one developed by ONYX, a genetically modified adenovirus called ONYX-015. Whereas the reovirus targets cancer cells with an activated ras, the modified adenovirus is supposed to work in cancer cells in which the tumor suppressor gene p53 isn't doing its job of preventing cell growth. Usually adenoviruses, which can cause flulike symptoms in people, make a protein that blocks p53 activity, which would otherwise prevent the virus from replicating. But the ONYX researchers deactivated that gene in their adenovirus, which consequently should replicate in and kill only those cells—including many cancer cells—whose p53 is out of commission for other reasons (Science, 18 October 1996, pp. 342 and 373).

    Earlier this fall, Anthony Hall and his colleagues at the University of Otago in Dunedin, New Zealand, questioned this picture, reporting in Nature Medicine that the virus does infect cancer cells that have a normal p53 gene and even seems to need the gene to destroy the infected cells. But McCormick thinks that these cell lines have other genetic changes that inhibit p53 activity. Linke adds that the bottom line is how well the viruses kill tumors: “If these viruses can selectively kill tumor cells without adversely affecting normal cells, perhaps the genetic status of the tumor cells is not such an important issue.”

    Preliminary results indicate that ONYX-015 can meet those criteria. The first clinical trials of the virus took place in 1996 and demonstrated that it is safe to use. Since then, two more groups, each consisting of 30 patients whose head and neck cancers had not responded to previous therapies, have been undergoing treatment with the adenovirus.

    The results so far show that the virus alone shrank tumors by at least 50% in slightly more than a third of the patients studied. But “even more dramatic has been the efficacy of the virus in combination with chemotherapy,” says McCormick. Within a month, tumors completely disappeared in two of the first 10 patients treated with both therapies, and in seven more, the tumor shrank by more than 50%, with no significant side effects. Overall, ONYX-015 is proving to be “definitely much more effective than indicated from our mouse data,” says McCormick. ONYX has also started testing this virus against pancreatic, colon, and ovarian cancers and plans to use it in people with brain tumors soon.

    In July, another company called Calydon Inc., based in Sunnyvale, California, began a clinical trial of its own, somewhat different adenovirus, CN706, in men with recurrent prostate cancer. Company scientists had modified this virus by splicing into its genome the control DNA that normally regulates the expression of prostate-specific antigen, a protein made only in prostate cancer cells. In the adenovirus, the regulatory DNA turns on a viral gene that spurs viral replication, but only in response to the right combination of hormones and transcription factors. Because this array of molecular messages is found only in prostate cancer, “it's a very neat way of targeting the specificity [of the virus],” says Jonathan Simons, an oncologist at Johns Hopkins University in Baltimore, Maryland, who is assessing the safety of CN706 for Calydon.

    Results of this trial are not yet available, but Simons says he is excited about the potential of using these viruses to treat prostate and other cancers that grow rather slowly. Because conventional therapies are designed to attack rapidly dividing cells, they don't work well on prostate cancer. “But [adenoviruses] kill independent of the cell cycle,” he points out.

    A third oncolytic virus that has moved into clinical trials is a herpesvirus produced by Georgetown's Martuza. Herpesviruses can cause encephalitis and other problems in humans, so Martuza and his colleagues needed to disable the virus so it could no longer reproduce in normal cells. In the first stage of this work, completed in 1991, they inactivated genes that produce enzymes the virus needs to replicate. As a result, the virus could multiply only in actively dividing cells, such as cancer cells, that make enough of these enzymes themselves. The researchers then demonstrated in lab culture that the modified virus destroys glioblastoma cells but not normal cells (Science, 10 May 1991, p. 854).

    To further ensure that the virus is safe, the team has since knocked out a virulence gene that enables the herpesvirus to cause encephalitis. Because of the multiple changes in the virus's genome, “the chance that it can revert to the wild-type is virtually zero,” says Martuza. Herpes simplex G207, as the new, improved version is called, “so far has been effective in essentially all solid tumors” tested, either in laboratory dishes or in rodents, says Martuza, and the animals suffered no detectable ill effects. Tests to assess the safety of G207 in humans began last February and will ultimately include two dozen patients with glioblastoma.

    One other virus, a small, nonpathogenic virus called a parvovirus, went through preliminary human trials 8 years ago that showed it is safe to use. But its developers, Jean Rommelaere and his colleagues at the INSERM lab of the German Cancer Research Center in Heidelberg, Germany, wanted to improve the virus's tumor-killing potential before proceeding with further tests. “In the race between tumor proliferation and viral amplification, sometimes, the tumor is the winner,” Rommelaere explains. So his team has spent several years developing ways to give the virus an added advantage, such as by adding genes that will recruit immune system cells to aid in tumor killing. In new cell-culture and animal studies, the virus's anticancer effect is now “more pronounced,” he adds.

    Other researchers are also trying to bolster the tumor-killing potential of the oncolytic viruses. At the University of Alabama, Birmingham, James Markert and his colleagues find that adding genes for the cytokines interleukin-2 or interleukin-5 boosts the immune system's attack on the tumor. Others are trying to modify the viruses so that they will make the cells more susceptible to traditional cancer treatments as well as kill them directly.

    For example, at Massachusetts General Hospital in Boston, neurosurgeon Antonio Chiocca and his colleagues have added a rat gene for a protein called cytochrome P-450 to the genome of a herpesvirus. Cytochrome P-450 converts cyclophosphamide, a drug used for cancer chemotherapy, to its active form. Consequently, as this virus spreads through a tumor, it not only kills the cells directly but also makes them susceptible to cyclophosphamide. Chiocca's team is now evaluating this virus in animal studies. And there's talk of putting in genes for compounds that will make a tumor more sensitive to radiation. “That's the beauty of this viral technology: With one agent you can deliver an oncolytic effect, a pro-drug activating effect, even a radiation-sensitizing effect,” Chiocca says.

    Of course, early excitement about a potential cancer therapy often gives way to disappointment, or at least realism. “Caution must be exercised, since the long-term side effects are not really known,” says Linke. But the concept of making tumors get sick and fade away has undeniable appeal, says Simons. “This is the kind of thinking we need in new cancer pharmacologies.”


    Popular Interest Fuels a Dinosaur Research Boom

    1. Erik Stokstad

    Paleontologists are learning to capitalize on the popularity of dinosaurs, and new discoveries, labs, and exhibits are the result

    In 1994, paleontologist Cathleen May was running out of time and money. The University of California, Berkeley, graduate student had discovered an Apatosaurus skeleton in Curecanti National Recreation Area, near Gunnison, Colorado, but the bones were in danger of eroding away and her grant from the National Park Service was too small to excavate them. Many a similar skeleton has been left in the field for lack of funds, but May found an unexpected savior: Hollywood. She hooked up with an L.A. animation firm keen on creating a virtual dino dig for kids, gave interviews, let the company film the site, and wound up with $24,000 over 2 years. That was enough to finish the dig. The 20-meter-long Apatosaurus, one of the oldest known, is now headed for the Museum of Western Colorado in Grand Junction.

    Hollywood isn't such an unlikely sponsor these days. After 65 million years of extinction, dinosaurs have conquered school yards, bookstores, and the video rental market. And the insatiable public appetite for the beasts is boosting research. Students are crowding into dinosaur paleontology classes, corporations and philanthropists are pledging support and donating specimens, and money is flowing into the field from movie and book spin-offs. Such nontraditional funding has its dangers, and despite all the activity, few researchers are flush with funds. But some say such sources are the key to survival for dinosaur paleontology.

    Many researchers agree that popular enthusiasm and funding have combined with new discoveries to reanimate the field. A new analysis suggests that the number of dinosaur papers is on the rise, and new positions are appearing at a time when other areas of paleontology are barely holding steady. Spectacular fossil discoveries follow one upon another. This issue of Science reports the latest find: an African specimen with a fish-eating, crocodilelike skull, which paleontologist Paul Sereno of the University of Chicago and his colleagues describe on page 1298.

    The field wasn't always so active. Dinosaurs have long been popular with the public, but scientifically they were a sleeper from the 1930s through the 1970s. Despite big dinosaur exhibits, most major museums had no Ph.D. dinosaur paleontologist. “Dinosaurs were considered gee-whiz things, good to show to the public but not particularly important from an evolutionary point of view,” recalls Edwin Colbert, retired curator of dinosaurs at the American Museum of Natural History (AMNH) in New York City.

    But in the 1970s, the field was rocked by the controversial idea that dinosaurs were warm-blooded and active like birds. The notion that the last dinosaurs were wiped out by an asteroid impact stirred even more interest. From then on, research seemed to take off: Since 1969, the number of dinosaur genera described has more than doubled, to about 350, notes Peter Dodson of the University of Pennsylvania School of Veterinary Medicine in Philadelphia. Papers have surged too. Back in the late 1980s, only about one in 10 papers in the Journal of Vertebrate Paleontology was on dinosaurs, according to a new analysis by Richard Cifelli, a mammal paleontologist at the University of Oklahoma, Norman, and former JVP editor. But by 1997, almost 25% of the articles in JVP were on dinosaurs.

    Many paleontologists say that public interest bordering on mania has pumped specimens, students, and even jobs into the field. The movie Jurassic Park, for example, grossed nearly $900 million—vastly more money than all government agencies combined have ever spent on vertebrate paleontology. Although scientists don't share directly in these profits, the enthusiasm those figures reflect “doesn't hurt,” admits Sereno, who has been featured in a half-dozen television documentaries and was listed as one of People magazine's 50 Most Beautiful People in 1997.

    In 1986, his first year at the University of Texas, Austin, paleontologist Timothy Rowe's dino survey class had one of the largest enrollments in department history. “In some ways it's been my meal ticket here,” he says. The next year, when he added labs to the course, he was able to fund six graduate students in paleontology as teaching assistants. Because many public universities distribute funding by the number of undergraduates taught, “a course that brings in 500 students really turns heads,” says mammal expert Cifelli, who also teaches a dinosaur survey course.

    And although many of the larger museums still haven't hired dinosaur specialists, popular interest has fueled the birth of regional museums that rely on dinosaurs as the main attraction, such as the Museum of the Rockies in Bozeman, Montana, and the Royal Tyrrell Museum of Paleontology in Drumheller, Alberta. “Where there used to be five museums where you could see dinosaur collections, now there's literally hundreds,” says Sereno. Each one creates opportunities for paleontologists.

    When it comes to research money, private support can make a huge difference, because agencies give only moderate support to paleontology. Last year, the National Science Foundation gave out about $1 million worth of new grants in vertebrate paleontology; almost half went to three dinosaur projects. The National Geographic Society handed out almost $316,000 to vertebrate paleontologists. But the JVP has received $500,000 in private donations in the past 4 years to support fossil preparators and pay publishing charges.

    And more than 70 researchers got small grants from an organization called The Dinosaur Society, founded in 1991. Two years later, the society teamed up with Steven Spielberg and Universal Studios to put out a traveling exhibition based on Jurassic Park, showcasing casts of skeletons and eggs, as well as studio props and merchandise. Part of the proceeds went to the society, which began to give out peer-reviewed grants for dinosaur research, often supporting exploratory field trips that agencies won't fund. By 1997, when the exhibition was canceled, the society had handed out more than $980,000. “The Dinosaur Society was an experiment of science going into business, and it worked,” says Steve Gittelman, president of a marketing firm who served as the society's second president. In addition, since May 1997, the Jurassic Foundation has amassed about $150,000 from The Lost World exhibit, which will be distributed as grants next year.

    Other private foundations also dig deep for dinosaur research. Sereno has received substantial support from the Packard Foundation and Pritzker Foundation. Rowe thinks that dinosaur appeal helped him win a major grant from the Keck Foundation for a high-resolution computed tomography scanner.

    A few enterprising paleontologists have managed to tap public interest in other ways. Lou Jacobs, a mammal paleontologist at Southern Methodist University in Dallas, wrote two general-interest books about Texas dinosaurs. In the early 1990s, a chain called Half Price Books, headquartered in Dallas, agreed to donate its profits on the books—about $50,000 so far—to paleontology.

    Sometimes the fossils themselves are sources of funds. When the most expensive fossil in the world, a Tyrannosaurus rex named Sue, was bought at auction in 1997 for $8.4 million, the Field Museum of Natural History in Chicago enlisted the support of McDonald's, Disney, and the California State University system. The deal will support a prep lab, two staff positions, six preparators, and a postdoc.

    Indeed, corporations hold the big money, and they're often willing to spend some in exchange for a tax break and cheap advertising. Mercedes Benz supplied field vehicles for the recent AMNH expeditions to the Gobi desert, and American Airlines flew 6 tons of African dinosaur fossils and some of Sereno's crew back to the United States in 1993. “The possibilities are unlimited,” says May, who is now director for Policy and Environmental Issues at the Geological Society of America.

    But paleontologists are often uneasy with corporate funding. Potential donors must be catered to, says Gittelman. “These things irritate the psychology of the scientist,” he says, “and most of them won't do it.” Partly it's the worry of a stigma. “Nontraditional sources are still almost like dirty money,” says May, “because they may be—and often are—generated not on the scientific importance of the endeavor but on spin-off benefits like education or career training or goodwill value.” And the visibility sometimes gives the mistaken impression that dinosaur research is richly funded, says Sereno. In fact, he says his work is “a hand-to-mouth operation.”

    Yet other paleontologists don't seem to begrudge the popularity of dinosaurs. “This rise fuels a lot of the rest of the science, so those of us who don't work in dinosaurs are perfectly delighted with these trends,” says Cifelli. “Dinosaurs are a vehicle for highlighting other specimens,” says mammal paleontologist John Flynn of the Field Museum. “People love those extinct things that represent a different world. Dinosaurs are just one heightened example of that.”


    Does Science Know the Vital Statistics of the Cosmos?

    1. James Glanz

    Cosmologists recently debated whether signs of a background energy in empty space point to a unified picture of the origins and makeup of the universe

    Nelson Algren, the Chicago writer, said he lived by just three rules, two of which could be listed in polite company: Never play cards with any man named “Doc” and never eat at any place called “Mom's.” Viewers of the face-off called “Great Debate III: Cosmology Solved?”—pitting the University of Chicago's Michael Turner against Princeton University's James Peebles in a crowded auditorium on 4 October—might have come up with another law, to be violated at great peril: Never debate a cosmologist whose viewgraphs are considered objets d'art.

    The artful cosmologist is Turner, whose colorful creations were once the subject of a one-man exhibition. With the help of flamboyant graphics, he argued that for the first time in history, cosmologists have a credible handle on the origin, overall makeup, and ultimate fate of the universe. His case, made to an audience of several hundred astronomers, students, and interested nonspecialists at the National Museum of Natural History in Washington, D.C., drew heavily on this year's observations of distant, exploding stars called supernovae. In Turner's optimistic view, these cosmological beacons have helped bring previously conflicting evidence into an eerie concordance. As one of the viewgraphs proclaimed in fat letters, “Cosmology solved? Quite possibly!”

    By showing that the universe is expanding at an accelerating rate rather than slowing from the force of gravity, the supernovae imply that the bulk of the universe consists not of matter but of a mysterious background energy called the cosmological constant. In Turner's picture, this energy acts as a cosmic deus ex machina, rescuing the inflationary theory of cosmic origins. This favored theory had been threatened by astronomers' inability to discover as much matter in the universe as it predicted. But now that the universe is fleshed out with pure energy, said Turner, all the pieces fall into place.

    His opponent, Peebles, the reserved author of Principles of Physical Cosmology, a book that all but defines the field, argued for caution. Although the concordance is theoretically plausible, he said, the resulting universe is chock-full of stuff too bizarre to accept without serious reservations. “This was a ‘good cop, bad cop’ deal where [Peebles] was doing his best to be pessimistic,” says Jerry Bonnell of the NASA Goddard Space Flight Center in Greenbelt, Maryland. But many astronomers are finding it hard to argue with Turner's infectious optimism. “At the moment there seems to be remarkable concordance and peace between the various observations,” says Paul Steinhardt of Princeton. The question that nags at him and most of his colleagues, he says, is whether “everything will fall apart” with further observations.

    The history of cosmology is enough to give even an ardent optimist pause. The face-off was the third commemorating a 1920 debate, held in the same auditorium, between the astronomers Harlow Shapley and Heber Curtis, who argued over the size of our galaxy and whether it comprises the entire universe or is only one island of stars among many. Much of their debate revolved around exploding and variable stars that, like the supernovae studied today, served as “standard candles”—objects thought to have a predictable brightness, so that their apparent brightness can be used as a measure of distance.

    Both participants were partly right—often for the wrong reasons—and partly wrong. And both misinterpreted the evidence from standard candles, in part because they did not appreciate how interstellar dust could dim them. “It was 40 years before the dust settled on that debate,” says Owen Gingerich, a historian of astronomy at the Harvard-Smithsonian Center for Astrophysics (CfA). The moral could be that present-day cosmology is not as solid as it appears. But for now, Gingerich concedes, “it all seems to be coming together very dramatically.”

    Turner, standing in for the late David Schramm, the equally irrepressible Chicago cosmologist who died in a plane crash last December, has been trying to stitch together a consistent picture of the universe ever since he, Steinhardt, Princeton's Jeremiah Ostriker, and a few others realized that they could connect the vast features seen in the sky today with an appealing theory of the universe's origin. According to inflationary theory, in the first fraction of a second of the big bang, the universe experienced an exponential growth surge. The newborn universe, smaller than a coconut, would have been crisscrossed by waves of quantum uncertainty, and theorists realized that inflation would have stretched those waves into the precursors of today's largest structures: the giant clusters and filaments of galaxies and the ripples on the microwave background radiation, the big bang's afterglow.

    The match between the structures we see and the quantum fluctuations in the infant universe is best if the overall density of matter in the universe is low. And a low matter density is just what astronomers have been finding as they try to “weigh” large clusters of galaxies—the biggest samples of the universe that telescopes can take in. But there's a problem: The simplest version of inflation predicts that the universe contains enough matter to make it geometrically “flat” in the four-dimensional space described by Einstein's equations of relativity. That's far more matter than the observations suggest.

    A flat universe, Turner emphasized, isn't just theoretically desirable. It also meshes with sketchy observations of the temperature ripples in the microwave background. Cosmologists can predict how big the ripples should be for a particular amount of matter and energy in the universe, so their observed size serves as a cosmic probe. And the tentative results so far do seem to show that the strongest ripples have a size of about 1 degree on the sky—the marker of a flat universe. “There it is—the baby picture of the universe, at 300,000 years old,” said Turner, showing a viewgraph with the data.

    Turner and others thought they saw a way out of the conflict: a more refined version of inflation in which reservoirs of energy in empty space itself—equivalent to matter according to Einstein's equation E = mc²—make up the matter deficit and flatten the universe. To make everything fit, however, the universe would need about twice as much energy in the cosmological constant, called lambda, as in matter. That energy would lend a “springiness” to the cosmos, counteracting gravity on large scales and causing the expansion of the universe to accelerate over billions of years. That possibility was so bizarre that few scientists could accept it.
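The bookkeeping behind this picture can be sketched with the density parameters cosmologists use, expressed as fractions of the "critical" density of a flat universe. A minimal sketch using only the rough proportions quoted above (about twice as much energy in lambda as in matter; the exact shares are illustrative, not the teams' fitted values):

```python
# Contributions to the cosmic energy budget, as fractions of the
# critical density that corresponds to a geometrically flat universe.
omega_matter = 1 / 3   # roughly what "weighing" galaxy clusters suggests
omega_lambda = 2 / 3   # the cosmological constant, about twice the matter share

# The simplest inflationary prediction is flatness: the shares sum to 1.
total = omega_matter + omega_lambda
assert abs(total - 1.0) < 1e-12  # together, matter and lambda flatten the universe
```

This is why the supernova result rescued inflation: clusters alone supply only about a third of the required total, and lambda plausibly makes up the rest.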

    Then, earlier this year (Science, 30 January, p. 651, and 27 February, p. 1298), two groups announced that they had detected signs of that speedup in observations of distant supernovae. “That's the smoking-gun signature of lambda,” Turner told his listeners. “This is the first time a cosmologist has been able to stand up in front of an audience,” he added in summary, “and say, ‘I have a prima facie case for a flat universe.’”

    Peebles, in his rejoinder, compared the intense activity in cosmology over the last few years to “a really good party.” But he also listed open questions that, he said, left him with an “uneasy feeling”—a kind of cosmic katzenjammer—about whether the concordance will survive new and more precise tests. Invoking one lesson of the Curtis-Shapley debate, Peebles noted that despite astronomers' best efforts to account for dust and intrinsic differences in the supernovae (see p. 1249), something other than cosmic acceleration could still be skewing the most distant beacons. He also emphasized that no one really knows what the background energy or even the bulk of the matter in the universe could possibly be.

    Astronomers in attendance agreed that there's plenty of room for doubt. “If you have a model in which most of the matter and energy are unknown, then it's not much of a model,” said Margaret Geller of CfA. And it is a little too neat for some astronomers to swallow comfortably. “Mike Turner came close to saying that we expected” the supernova result, said Peter Garnavich of CfA, a member of one of the supernova teams. “Certainly I didn't.”

    But Garnavich, like many of his peers, acknowledges the power of Turner's picture. “It does fit,” he says. And if it holds up, says Princeton's Steinhardt, that just means there will be “new and even deeper mysteries to address”—and grist for future debates.


    No Backing Off From the Accelerating Universe

    1. James Glanz

    Intensive scrutiny of data from distant exploding stars has not shaken the conclusion that the universe contains a large-scale repulsive force

    Chicago—Early this year, two international teams of astronomers came to an extraordinary conclusion about the nature of the universe: The unexpected dimness of distant, exploding stars called supernovae indicates that a large-scale repulsive force permeates the universe, accelerating its expansion and sweeping distant objects unexpectedly far away. Last week, a workshop* was held here to see how that conclusion is holding up in the face of new data and to give other astronomers a chance to challenge the results. But the wary posture first adopted by many researchers turned into an embrace: The teams' original conclusion not only withstood the intensive scrutiny, it actually gathered further support.

    One by one, participants took up possible confounding factors that could be making the supernovae look farther away than they are, which would mimic the large-scale repulsion generated by a background energy called the cosmological constant, or lambda. None of the possible suspects—cosmic dust, weaker explosions in the distant, early universe, or other effects—could explain away the results. “I personally very much dislike the cosmological constant,” said Mario Livio, an astrophysicist at the Space Telescope Science Institute in Baltimore. “But given the existing observations, there is a lambda.”

    While astronomers like Livio may accept the message of the supernovae reluctantly, some cosmologists welcome it, because it rounds out a consistent picture of the origin, contents, and fate of the universe (see p. 1247). But with so much riding on these stellar explosions, the pressure is on for theorists to understand just why the explosions are so uniform—and therefore why they are such apparently reliable beacons.

    By studying nearby examples of the brilliant supernovae called type Ia's, observers long ago determined that they blow up with the same absolute brightness each time, so that their apparent brightness as seen from Earth can serve as a distance measure. More recently, observers found that they could correct for small, residual differences in the explosions, because intrinsically brighter supernovae, for example, signal their presence by taking longer to light up and fade than dimmer ones do. “This is a way to ‘read the wattage of the light bulb,’” said Alexei Filippenko of the University of California (UC), Berkeley, and a member of the High-z Supernova Search Team, which is led by Brian Schmidt of the Mount Stromlo and Siding Spring Observatories in Australia.
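The "standard candle" logic rests on the inverse-square law: if every type Ia peaks at nearly the same absolute brightness, its apparent brightness fixes its distance. A minimal sketch of the textbook distance-modulus relation (ignoring the corrections for cosmic expansion, dust, and light-curve shape that the teams actually apply; the magnitude values below are illustrative round numbers, not measured data):

```python
def luminosity_distance_pc(apparent_mag, absolute_mag):
    """Invert the distance modulus m - M = 5 * log10(d / 10 pc), giving d in parsecs."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Type Ia supernovae peak near absolute magnitude -19 (illustrative value).
# One observed to peak at apparent magnitude 24 would then sit, naively, at:
d = luminosity_distance_pc(24.0, -19.0)
print(f"{d / 1e9:.1f} billion parsecs")  # a few billion parsecs, i.e. billions of light-years
```

A supernova that comes out *dimmer* than this relation predicts for its measured redshift looks "too far away"—which is exactly the signature the two teams interpret as accelerating expansion.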

    Reassured by these local observations, Schmidt's team and the Supernova Cosmology Project, led by Saul Perlmutter of Lawrence Berkeley National Laboratory and UC Berkeley, have been pushing far out into the universe and back in time by finding supernovae in distant galaxies, then measuring their brightness and the rate at which cosmic expansion is carrying them away from Earth. The announcements early this year were among the first results of the quest, and at the meeting, Filippenko and Perlmutter gave separate talks on the teams' latest haul of data. Perlmutter's group has now fully analyzed 42 distant supernovae with more in the pipeline, while the High-z team is now adding about a dozen new events to the 16 it has already published. The newest measurements only reinforce the earlier conclusion: The expansion of the universe has sped up since the supernovae exploded billions of years ago, by an amount implying that 70% of the universe's energy is in the form of lambda.

    “You will see that we are in remarkably violent agreement,” joked Perlmutter about the two highly competitive groups. Both speakers then described how they have tried to rule out the possibility that they are being fooled by cosmic dust, which signals its presence by reddening the supernovae—just as dust reddens the setting sun. Whether the astronomers apply a correction to the brightness of the red supernovae or simply throw them out, said Perlmutter, the conclusions do not change. The teams are pressing the search for unknown forms of “gray” dust that might shroud only distant explosions and not redden them. So far, all tests have turned up negative.

    The teams have also considered the possibility that type Ia supernovae were intrinsically dimmer in the past, perhaps because the raw materials that made up the parent star were less “polluted” with heavy elements—which gradually build up in galaxies as generations of stars are born and die. But in a joint study, the teams found that the spectra of nearby and distant supernovae match at virtually every bump and wiggle after the simple “wattage” correction, implying that the composition of near and distant supernovae is the same.

    There are other reasons to doubt that different levels of heavy-element pollution in the host galaxies are throwing off the results, said Philip Pinto of the University of Arizona, Tucson, who is not a member of either team. “We have such a rich sample of galaxies to sample from nearby,” said Pinto—some of them heavily polluted and some not. “And we don't see any differences among the local, really well-observed supernovae.”

    But theorists can't say exactly why the supernovae appear to work so beautifully as standard candles. “As the referee in this contest between the theorists and observers,” said the University of Chicago's Donald Lamb, an astrophysicist who chaired a lengthy session on explosion mechanisms, “I hereby declare that the observers have won hands down.”

    Even so, initial work by the theorists suggests it's at least plausible that type Ia's explode with such predictable fury. The explosions are thought to originate from a particular type of white dwarf star—a dim, dense, stellar cinder made mostly of carbon and oxygen. In one explosion scenario, the dwarf's gravity steadily sucks material from an ordinary companion star. When the dwarf's mass passes a particular value called the Chandrasekhar limit, it begins to contract under its own weight, heating the carbon core and igniting thermonuclear fusion there.

    That uniform mass threshold could help explain the regularity of the ensuing explosion. And other factors could push the explosions even closer to uniformity, said Ken'ichi Nomoto, reporting on work he did with Izumi Hachisu at the University of Tokyo and Mariko Kato at Keio University in Japan. Their calculations suggest that a white dwarf can steal material from a companion star and blow up only if it has a specific composition. Material ripped from the companion, he explains, tends to form a huge gas cloud around the dwarf, which could disrupt the steady accretion. The accretion can proceed only if the white dwarf sweeps away the cloud with a strong wind.

    Intense x-rays from the white dwarf drive material off the dwarf's surface and so create the wind. But the wind will be either too weak to disperse the cloud or so strong that it prevents all accretion unless the white dwarf is laced with just the right amount of x-ray-absorbing heavy elements—ensuring further uniformity. “If anything, it strengthens [the case] for a standard candle,” says Livio.

    The clinching evidence that the supernovae are telling the truth about cosmic expansion, however, could come from further observations. Both teams are reaching for more distant supernovae, which probe the expansion rate even earlier in cosmic history. In the young universe, when the same amount of matter was crammed into a smaller volume, gravity should have overwhelmed the unchanging boost of the cosmological constant, slowing the expansion. The earliest supernovae should appear relatively close and bright—an effect that no confounding factor suggested so far could create. Perlmutter's team took the lead last month, discovering what is probably the most distant supernova yet at the 10-meter Keck Telescope in Hawaii.

    Named after the composer Tomaso Albinoni (Perlmutter's team now has so many supernovae to keep track of that they have taken to naming them after composers), this event is more than 8 billion light-years away. With many more like it, the team should be able to see the brightening. So far, the musical theme fits: As far off as the searchers can find them, the supernovae all keep playing the same tune.

    • * “Type Ia Supernovae: Theory and Cosmology,” University of Chicago, 29–31 October.


    Ocean Drilling Floats Ambitious Plans for Growth

    1. Richard A. Kerr,
    2. Dennis Normile

    A proposed major expansion of the world's ocean drilling research program is taking the community into uncharted waters

    For 15 years, a vessel that looks like a cross between a freighter and an oil derrick has been roaming the oceans, boring holes in sea-floor sediments and crust. Its team of roughnecks and scientists has sampled ancient muds beneath the ice-infested waters of Antarctica and rocky crust off the Galápagos Islands. However, some of the most tempting scientific targets on the ocean floor, including unstable sediments, oil- and gas-rich regions, and the deepest reaches of the crust, have been off limits to the JOIDES Resolution and the Ocean Drilling Program (ODP), the 22-nation scientific consortium that operates it. Next year, Japan hopes to begin building a $350 million drill ship that could open up these forbidden zones. But researchers don't know if their governments will be willing to spend the extra money needed to operate that country's generous gift to the ocean drilling community.

    Japan's plans, expected to be approved early next year by the Diet, call for up to $40 million to start construction of a ship equipped with a riser—a pipe enclosing the drill pipe—that extends from the ship to the sea floor. Risers, which are standard on deep-sea oil platforms, allow drillers to flush heavy debris from deep holes and shore up unstable sediments. They also help provide a safeguard against blowouts when the bit penetrates oil or gas deposits. Japan intends to pick up the entire tab for building the ship, which should be completed by 2003, just when the ODP's lease on the JOIDES Resolution will end. The timing seems perfect, and many ocean drillers would welcome the riser ship's capabilities. “We've come up against these technological barriers. … We need a riser drill ship,” says ODP director Kathryn Moran.

    The problem is that most people in the ocean drilling community believe the program also needs a second ship, to replace the JOIDES Resolution, that could drill less ambitious holes in rapid succession while the riser ship concentrated on more challenging projects. And they know that the annual cost of operating two ships—roughly $130 million, or nearly three times the current $44 million budget—is steep. “The [U.S.] National Science Foundation and we are aware there has to be new money if [a two-ship program is] going to fly,” says Michael Arthur, a geochemist at Pennsylvania State University, University Park, and chair of the U.S. Science Advisory Committee to Joint Oceanographic Institutions Inc., which runs the ODP from Washington, D.C. Although no one can say how NSF and its counterparts in Europe will be able to find that new money, administrators and scientists have already set up the framework for a successor to the ODP, dubbed the Integrated Ocean Drilling Program (IODP), that assumes the use of two ships.

    There is no question that the limitations of the Resolution are hindering scientific progress. For example, Arthur notes that a recent attempt to drill into a fault in the Woodlark Basin in the western Pacific had to be stopped after a rising hydrocarbon content indicated that drillers might be approaching an oil or gas deposit. The Resolution's attempts to drill more than about 2 kilometers into the crust beneath the sediments, even in the absence of oil and gas deposits, have been foiled by jammed drill bits and poor core recovery. Nor has the ship had much luck drilling through the loose sands along continental margins, which provide a record of changing sea level.

    A riser drilling system addresses all of these problems. Risers are typically used with blowout preventers, which can prevent oil or gas from leaking into the sea. And the enclosing pipe of a riser provides a channel for drilling mud—a viscous slurry of clay, water, and chemicals—which is pumped down through the drill pipe and circulates back up in the space between the drill pipe and the surrounding riser. The dense mud helps prevent the drill hole from collapsing, lubricates the drill bit, and flushes cuttings away, in principle allowing the rig to extend as much as 7000 meters beneath the sea floor.

    Although the ocean drilling community identified the need for a riser in the early 1990s, Arthur says, the original idea was to lease such a ship from the oil industry as needed. But then the Japanese government, in particular its Science and Technology Agency (STA), stepped into the picture. STA's Marine Science and Technology Center (JAMSTEC), which is overseeing the design and construction of the vessel, saw a riser ship both as a key component of an increasing emphasis on basic research and as a project that might have technological spin-offs for Japan's shipbuilding industry (Science, 11 July 1997, p. 170). The combination was a winner politically. “The Japanese government and the [STA] are intent on completing this ship,” says Hajimu Kinoshita, JAMSTEC's director of deep-sea research.

    Scientists are already thinking up missions for the riser ship. JAMSTEC, for example, has sponsored workshops on its scientific objectives, and a mission to study the ocean floor just east of Japan has received top priority. There drilling would approach its depth limits to penetrate a fault that generates large earthquakes off Japan. Kinoshita says this target will also allow officials to keep a close eye on the vessel during its shakedown cruises. Mark Zoback, a geophysicist at Stanford University, says such drilling is “very ambitious and it's going to be expensive. … [But] we've been talking about these questions [in fault mechanics] for a long time, and they haven't been answered by indirect techniques.”

    The riser ship could also enable marine geophysicists to reach a long-sought goal: the Mohorovičić discontinuity, or Moho, the presumed boundary between the crust and the mantle. Although seismic waves bounce off the Moho, geophysicists aren't sure whether it's the boundary between the crust and the underlying mantle, an intrusion of rock into either layer, or something else. “There's no alternative to drilling” to settle such issues, says geologist Henry Dick of the Woods Hole Oceanographic Institution in Massachusetts.

    In the mid-1960s, an NSF-funded project to reach the Moho, begun in 1958, became the first basic research project to be terminated by Congress after cost estimates ballooned from $5 million to $75 million. And 20 years ago NSF had equally ambitious, but ultimately frustrated, plans to convert the Central Intelligence Agency spy ship Glomar Explorer into a riser drill ship that would have cost twice as much to operate as Japan's new vessel (Science, 25 February 1983, p. 942). But in some parts of the ocean, Kinoshita says, the Moho might be within striking distance of the new ship after a planned upgrade in which the riser system is enhanced to work in water up to 4000 meters deep.

    Although the initial funds for the riser ship are included in next year's JAMSTEC budget, completion of the ship will depend on continuing appropriations over the next 5 years. Kinoshita says the government has never pulled the plug on a project that has reached this stage. But he notes that these are unprecedented times for Japan's economy, in the doldrums since the early 1990s.

    The bigger problem will be finding money to keep it at sea. At an estimated $85 million a year, the riser ship would be expensive to operate on its own, and many in the ocean drilling program are adamant that it not be the only vessel in the drilling fleet. Many paleoceanographers, for example, don't need its specialized capabilities; for their research on climate and past ocean circulation, collecting a lot of shallow sediment cores is more important. The Japanese proposal “scared everybody,” says Nicklas Pisias, a paleoceanographer at Oregon State University in Corvallis, explaining that paleoceanographers worry that a riser ship would suck up funds while drilling just a few deep holes in difficult locations.

    “To get the community behind [a riser ship], we need two ships of different design,” says Arthur—the riser and a vessel resembling the current JOIDES Resolution. Last month Pisias pointed out another motivation for the two-ship approach at the annual meeting of the Geological Society of America. “The Japanese have said they have to move forward with a riser ship,” he said. “The question is: Does the United States want to become a Third World nation” in ocean drilling by not coming up with a second ship?

    The challenge, then, is finding the money to operate both ships. Kinoshita says the current thinking is to split the operating costs evenly among Japan, the United States, and Europe plus other participating countries. He is confident that Japan will find its share of the money, although he's troubled by a government decision last year that effectively cut the operating budgets of major facilities affiliated with the Ministry of Education, Science, Sports, and Culture (Science, 1 May, p. 669) as part of a countrywide belt-tightening measure.

    However, other drilling administrators are less sure that their countries will be able to expand support. John Ludden of CRPG-CNRS in Vandoeuvre-les-Nancy, who heads the French ODP scientific committee, says that “the only way Europe can go ahead as part of an international ODP is by paying a European membership plus additional funds, say from the European Commission or from private sources” such as oil companies. Both the European Commission and oil companies would be new funding sources for ocean drilling.

    U.S. officials are taking a wait-and-see attitude. “There's a good possibility that a strong justification could be made for a multiplatform program,” says Michael Purdy, who heads NSF's division of ocean sciences and is co-chair of the international working group of the IODP. “How that would be [financially] supported is unclear.”

    Through all the uncertainty, the Japanese reiterate their commitment. Kinoshita hints that Japan might pay a bit more than one-third if that's what it takes to float the two-ship program. And Masakazu Murakami, director of STA's Ocean and Earth Division, says that a worst-case scenario would have Japan operating the ship on its own at whatever level it could afford. “But I don't like to think about that,” he says. “We will really make our best efforts to convince other countries that it is worthwhile to pay” for a two-ship program.


    Temperate Forests Gain Ground

    1. Anne Simon Moffat

    A variety of factors, including the reversion of farmlands to forest, better technologies, and conservation, are contributing to the gains

    Too often, the news about the world's forests is relentlessly bad. For example, both Mexico and Indonesia are losing about 1% of their forest lands every year to logging and to slash-and-burn agriculture. But although no one denies that tropical forests are in dire straits, another story is often lost in the headlines about their plight: Many woodlands outside the tropics are quietly prospering.

    One recent example comes from Iddo Wernick of Columbia University, Paul Waggoner of the Connecticut Agricultural Experiment Station in New Haven, and Jesse Ausubel of The Rockefeller University in New York City. Based on an analysis of decades of Forest Service data, these researchers conclude that forest growth in the United States has outpaced forest clearing over the last 50 years, increasing the country's total timber volume by 30%. “Forests have been reborn, despite increased tree harvests,” says Ausubel.

    Enhanced forest growth is important, because it helps soak up carbon dioxide, offsetting some of the increase due to human activity, which many experts think fosters global warming. Indeed, a recent report suggests that that may be happening (Science, 16 October, pp. 386 and 442). Forests also provide habitat for wildlife, help prevent flooding and erosion, and contribute to the production of rich soils. But Wernick, Waggoner, and Ausubel have gone beyond simply documenting the increased forest growth. Their study, which appeared earlier this year in the Journal of Industrial Ecology, identified the factors that have led to this reversal of fortune.

    Among these are reversion of marginal farmlands to forests; improvements in tree-harvesting, paper-milling, and other technologies; paper and wood recycling; and the substitution of other materials for construction lumber. The analysis, says University of British Columbia forest economist Clark Binkley, “is a realistic assessment that we can grow more trees with less land. It is not speculative. They have pulled together the facts.”

    The first widely reported evidence of revitalized forest growth outside the tropics came from Pekka Kauppi of the University of Helsinki and Kullervo Kuusela and Kari Mielikäinen at the Finnish Forest Research Institute in Helsinki (Science, 3 April 1992, p. 70). They estimated that in Europe, including European Russia, the growing stock of wood increased 25% between 1971 and 1990.

    But the increase didn't put much of a dent in the global forest decline because Europe grows only 4% or so of the world's wood. Changes in North America will have much more of an impact, because the United States and Canada grow a lot more wood, at least 13% of worldwide stocks. And the Wernick-Waggoner-Ausubel analysis shows that U.S. forests are flourishing.

    The researchers came to this conclusion by looking at data, mostly compiled by the U.S. Forest Service, dealing with the amount of wood harvested, sizes and densities of forests, and other indicators of forest use. These figures revealed that while some areas have seen declines—between 1952 and 1992, wood stocks in the Pacific Northwest went down by 20%, for example—during that same period, wood stocks almost doubled in other areas of the northern United States, including the Great Lakes region, New England, and New York, where abandoned farmlands have reverted to forests. They also went up by 70% in the American South, where softwood plantations now thrive on soils degraded by decades of cotton farming, and by 18% in the Rocky Mountains.

    Logging didn't erase the gains, even though consumption of all timber products grew 70% between 1900 and 1993, because existing woodlands are producing more efficiently, harvest data show. This has been achieved by better spacing of trees in plantations, planting rapidly growing species and harvesting them on a shorter life cycle, and taking advantage of early, faster growth.

    Various conservation strategies also seem to be paying off. For example, much of the wood residue, such as twigs, branches, sawdust, and wood chips, that once went to waste on the forest floor is now gathered and fabricated into fiberboard, veneers, and insulation. Also, better cutting blades in sawmills have increased lumber recovery by 5% to 10%. Without these savings, every year U.S. mills would need 120 million more cubic meters of hardwood, more than all the wood taken in 1993 from Alaska, California, Oregon, and Washington.

    Other savings have come from recycling paper and other wood products, which increased by 150% between 1970 and 1993. And throughout the 20th century, Americans have been substituting other materials for wood-based products, for example, gathering their news and information from electronic media instead of from printed pages, and building their homes and businesses largely from plastic, glass, steel, and concrete instead of from wood.

    In the future, though, the savings achieved by such conservation measures are likely to be counterbalanced by continued population growth, coupled with increasing wealth. “Right now, we are growing more [trees] than we are cutting,” says forester Robert Hagler, who heads his own consulting firm in Reston, Virginia. But, he adds, that situation is likely to reverse, especially for conifers, which are in high demand for construction and paper.

    But, as the report points out, future measures could foster forest growth. Particularly promising is the application of modern agriculture to forestry. Peter Ince of the U.S. Forest Service in Madison, Wisconsin, says, “We are nowhere near the maximum biological productivity of trees. We've done a lot with agricultural crops, and that's where we'll wind up with trees.” Such efforts might include breeding trees for rapid growth, as well as the application of genetic engineering to promote resistance to pests or herbicides or to endow trees with special traits, such as low-lignin content, valued for papermaking (Science, 9 February 1996, p. 760). Another possibility is to develop better fiber crops, such as kenaf and hemp, that can substitute for trees as the raw material for paper and pulp mills.

    Indeed, Ausubel sees opportunities all around. He says that a modest, 1% annual increase in forest growth compounded by steady or slightly reduced demand “would shrink the extent of logging [in the U.S.] by one-half in 50 years,” a prediction likely to please, and amaze, many environmentalists.
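The arithmetic behind Ausubel's projection is simple compounding. As a back-of-the-envelope check, the sketch below (my own illustration, not from the study; the demand figures are assumptions) shows that 1% annual yield gains alone would cut the logged area by about 39% over 50 years, and that a slight decline in demand of roughly 0.4% per year would bring the reduction to one-half:

```python
def logged_area_factor(growth_rate, demand_rate, years):
    """Fraction of today's logged area needed after `years`,
    if per-area yield compounds at `growth_rate` per year and
    total demand changes at `demand_rate` per year."""
    yield_factor = (1 + growth_rate) ** years
    demand_factor = (1 + demand_rate) ** years
    return demand_factor / yield_factor

# Steady demand: 1% yearly yield gains shrink the logged area to ~61%.
print(round(logged_area_factor(0.01, 0.0, 50), 2))     # -> 0.61

# A slight (~0.4%/yr) decline in demand reaches the one-half mark.
print(round(logged_area_factor(0.01, -0.004, 50), 2))  # -> 0.5
```

In other words, the "one-half in 50 years" figure holds only if demand also eases slightly, which is consistent with the article's caveat that population growth and rising wealth could push demand the other way.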
