Ethicists Back Stem Cell Research, White House Treads Cautiously
- Eliot Marshall
Research on a scientifically promising type of human cell received a vote of confidence last week. In a decision that Stanford University biologist Paul Berg calls “gutsy,” the National Bioethics Advisory Commission (NBAC) recommended on 14 July that the federal government fund not only research on human embryonic stem cells but also the production of cell cultures—even if it means sacrificing embryos. In an official notice, NBAC, a presidential panel of 17 scholars and ethicists, says it will deliver its final report (not yet finished) to the president “very soon.”
The president's staff, however, didn't wait long to distance itself. On the same day that NBAC reached its decision, the White House released a note saying that the Administration's policy will be spelled out in guidelines being drawn up by the National Institutes of Health (NIH). It noted that the president has ruled out funding the creation of human embryos for research. Congress has gone further, for several years adding a general ban on funding embryo research to the NIH appropriation; the current law runs through 30 September. The White House concluded: “No other legal actions are necessary at this time because it appears that human embryonic stem cells will be available from the private sector,” and research on these cells “is permissible under the current congressional ban.”
With this, the Administration formally backed a policy adopted by NIH and the Department of Health and Human Services (HHS) in January. Their legal experts ruled that government funds may be spent to study, but not to derive stem cells from, embryos (Science, 22 January, p. 465). Under this policy, only private labs may develop human stem cells from embryos, but NIH-funded and other U.S.-backed researchers may use them. (The restriction on development doesn't apply to fetal tissue.) Even this plan is controversial, however, because some right-to-life activists have declared that any destructive use of an embryo is immoral.
Despite the legal clouds, many biomedical researchers think this field has a bright future. They say that stem cells derived from human embryos and fetal gonadal cells, which are capable of developing into a wide variety of specialized cells, may be a valuable source of transplant tissue. For the past 9 months, NBAC has been deliberating over ethical guidelines that might enable more rapid development of this biomedical technology by opening it to public funding. NBAC concluded that the potential benefits of stem cell research outweigh the disadvantages—provided the cells are drawn from embryos that would otherwise be discarded. NBAC recommended that only “spare” embryos from fertility clinics be used, and only when both donors give full consent. NBAC also said that the government should establish a watchdog committee to set ethical rules and enforce them.
NBAC's position could have an impact on debates on biomedical funding in Congress this summer and fall if right-to-life advocates seek to extend the congressional ban on embryo research and apply it explicitly to the derivation or use of embryonic stem cells. Representative John Porter, chair of the House appropriations subcommittee for Labor and HHS, which drafts the NIH budget, has tentatively set a meeting on 21 July to begin work on next year's appropriation. Although Porter told Science that he personally supports NIH's perspective, he said he didn't want this discussion to “overwhelm the funding process.” Porter said: “I told [NIH director Harold] Varmus that we should fight the issue on the intellectual basis of what will happen with or without this kind of research. … But I don't want to see NIH's funding wrapped up in an argument that to me is tangential.”
Other members of Porter's subcommittee—including Representative Jay Dickey (R-AR), a sponsor of the embryo research ban—reject both NBAC's view and the Administration's compromise position. “We believe that science should serve humans, not that humans should serve science,” says Dickey. He does not think the current law permits federal research on embryonic stem cells, and says he will help take the fight to court, if necessary. He hasn't proposed any change in the embryo research ban.
But Berg, a spokesperson for the American Society for Cell Biology, says NBAC has developed a position that he hopes will make sense to scientists and the public. He calls NBAC's recommendations for monitoring the field “bureaucratic,” but reasonable if they reassure the public that this research will be guided by ethical principles.
- RESEARCH FUNDING
Michigan Plans Massive Investment in Biotech
- Jocelyn Kaiser
In what may be the largest windfall for research from a state tobacco settlement so far, Michigan Governor John Engler this week signed a bill allocating a stunning $1 billion over the next 20 years for a competitive biotechnology research fund for his state's scientists. The fund, to focus on aging and health, may be spent on a range of programs, from research grants for diabetes to building new bioinformatics databases.
Michigan leaders hope their plan for a “life sciences corridor” centered around the state's research universities will put the area on the map as a powerhouse comparable to, say, Research Triangle Park in North Carolina. The plan makes use of the $8.5 billion Michigan will receive from last November's settlement with tobacco companies, under which 46 states will recover the costs of treating tobacco-related illnesses. The only comparable program is a 10-year-old research fund in California financed by cigarette taxes that allocates about $20 million a year to state scientists. Some other states have modest research programs on the drawing board.
The Michigan Health and Aging Research and Development Initiative—its official name—was proposed by the presidents of Michigan State University, the University of Michigan, and Wayne State University, as well as the Van Andel Institute for Medical Research and Education in Grand Rapids, founded by the family that owns the Amway company. The governor will appoint members of those institutions to serve on the steering committee, and awards will be made on a competitive basis to scientists in Michigan. Although the budget for 2000 has already been set at $50 million, subsequent budgets will be approved by the legislature.
The specific agenda has not yet been decided but is governed by legislation that calls for spending 40% on basic research, 50% on applied collaborative projects, and up to 10% on commercial development. Robert Huggett, research vice president at Michigan State, says “we want to focus at the molecular genetic level” on topics that might range from neuroscience to diabetes. The initiative's planners have talked about using some money “to bring in world-class scientists in a few critical areas” and build infrastructure such as nuclear magnetic resonance facilities shared among universities, says James B. Wyngaarden, former director of the National Institutes of Health and a member of the Washington Advisory Group, a consulting group in Washington, D.C., that helped design the initiative. But “there's an awful lot of planning to be done,” he adds.
Not everyone is happy with Michigan's plan, which would spend most of the rest of its settlement on college scholarships. Antitobacco advocates are disappointed that none of the money will go for smoking prevention programs. Michigan “doesn't have anything approaching a comprehensive tobacco prevention program,” asserts Joel Spivak of the Campaign for Tobacco-Free Kids.
But scientists in Michigan are giddy with anticipation. “This is a great story and an exciting day for Michigan,” says George Van de Woude of the National Cancer Institute, who as of October will be the new research director of the Van Andel Institute.
Why the Ice Ages Don't Keep Time
- Richard A. Kerr
According to the textbook theory, the ice ages and much other climate change should unfold with clocklike precision. In this astronomical or Milankovitch theory, the pacemaker for climate cycles over tens to hundreds of thousands of years is the rhythmic nodding and gyroscope-like wobbling of Earth's spin axis and the periodic stretching of its orbit, all of which change climate by redistributing sunlight across the planet. These cycles can be calculated millions of years into the past, and ice cores and other climate records do reveal climate swinging from one extreme to another with pendulum-like precision in time with some orbital forcings. But other climate changes, including the ice ages themselves, don't quite keep time, raising doubts about how much the astronomical theory can explain (Science, 8 May 1998, pp. 828 and 874). In this issue of Science, two studies attempt to tidy up some of these loose ends and reassert the power of the astronomical climate clock.
On page 564, geophysicist José Rial of the University of North Carolina, Chapel Hill, explains why the timing of the ice ages seems to be off by invoking an interaction between orbital forcings that resembles the way FM radio signals are generated. The work “show[s] that the Milankovitch frequencies really are deep in the [climate] data set,” says geophysicist Jeffrey Park of Yale University. And on page 568, geographer-plant ecologist Katherine Willis of Oxford University and her University of Cambridge colleagues present a new record of the start of the ice ages 2.75 million years ago suggesting that an intensification of one orbital cycle may have triggered a surprisingly abrupt drop into northern glaciation.
Although cycles of 23,000 and 41,000 years in the climate record do match up precisely with Earth's wobbling and nodding, the ice ages don't keep to the 100,000-year schedule that should be set by the periodic elongation of Earth's orbit. Some ice ages have come as much as 120,000 years apart; other cycles have been as short as 80,000 years, Rial says. But he noticed an underlying regularity: A full sweep from quicker, higher-frequency ice age cycles to slower, lower-frequency ones took about 400,000 years, suggesting that yet another astronomical cycle—a second, 413,000-year cycle in orbital elongation superimposed on the shorter cycle—might be modulating the frequency of the 100,000-year cycle the way broadcasters “frequency modulate” a carrier signal to produce FM radio broadcasts.
To test his idea, Rial calculated how a 413,000-year signal should modulate a 100,000-year one and checked the climate record to see how it matched the simulation. He found that the frequency of the 100,000-year cycle has risen and fallen in time with the longer modulating cycle, matching the calculation. As in an FM broadcast, the modulating signal itself failed to show up in the climate record but left its fingerprints in pairs of “sideband” signals that have frequencies just above and below the carrier frequency. Rial found exactly the predicted pattern of sidebands in the climate record, including a prominent 107,000-year oscillation. Frequency modulation “changes periodically the duration of the ice ages,” says Rial. “It's a pretty idea.”
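The sideband fingerprint Rial looked for follows from textbook frequency modulation, and can be demonstrated on a toy synthetic record. This is only an illustration of the principle: the modulation index, record length, and thresholds below are arbitrary choices, not values fitted to climate data.

```python
import numpy as np

# Toy frequency-modulated "climate record": a 100,000-year carrier cycle
# whose frequency is modulated by a 413,000-year cycle.
dt = 100.0                          # sample spacing, years
t = np.arange(0, 4_000_000, dt)     # 4-million-year synthetic record
f_c = 1 / 100_000                   # carrier frequency, 1/yr
f_m = 1 / 413_000                   # modulating frequency, 1/yr
beta = 1.0                          # modulation index (assumed)

# Classic FM: the carrier's phase is perturbed by the modulator
signal = np.cos(2 * np.pi * f_c * t + beta * np.sin(2 * np.pi * f_m * t))

# Power spectrum of the synthetic record
spec = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(len(t), d=dt)

def peak_near(f, tol=5e-7):
    """Strongest spectral power within tol (1/yr) of frequency f."""
    return spec[np.abs(freqs - f) < tol].max()

# The modulator leaves no peak of its own at f_m, but sidebands appear
# at f_c - f_m and f_c + f_m, flanking the carrier peak.
background = np.median(spec)
for f in (f_c, f_c - f_m, f_c + f_m):
    assert peak_near(f) > 100 * background
```

As in Rial's analysis, the slow modulating cycle never shows up as a spectral peak of its own; it betrays itself only through the pair of sidebands straddling the carrier.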
He can't point to a particular physical mechanism that would translate the 413,000-year cycle into a lengthening and shortening of the 100,000-year cycle, although he says that the longer cycle of sunlight changes may interact with an oscillating part of the climate system, such as ice sheets. Even so, other climate specialists are taken with the frequency-modulation idea. “I like very much the ideas of Rial,” says paleoclimatologist André Berger of the Catholic University of Louvain in Belgium. “Orbital forcing is certainly the pacemaker.” The fit between the predicted pattern of oscillations and the climate record is “intriguing,” says geodynamicist Bruce Bills of the Scripps Institution of Oceanography in La Jolla, California, but “it would be even better if you could point to an obvious physical mechanism that would explain why the system works that way.”
How the glaciation in the north got started in the first place 2.75 million years ago is another enigma. Earth had been cooling for 50 million years, perhaps because waning carbon dioxide was reducing the atmosphere's greenhouse effect—although that idea has recently been questioned (Science, 11 June, p. 1743). Another push toward glaciation could have come when the Isthmus of Panama closed about 4.5 million years ago, shutting the passageway between the Atlantic and Pacific oceans and redirecting warm ocean currents into the North Atlantic. That would have increased the supply of moisture to high latitudes and hence fostered the snowfalls that built the ice sheets. But sizable ice sheets still failed to form in the north for another 2 million years, suggesting that at least one more factor was still missing. Candidates have included a surge in North Pacific volcanism (Science, 10 January 1997, p. 161), whose airborne debris would have further cooled climate, and a change in Earth's nodding.
Now a hint of an astronomical trigger for Northern Hemisphere glaciation has turned up beneath a field of sunflowers in central Hungary: a high-frequency climatic “buzz” apparently excited by Earth's orbital wobbling. The sunflowers grow over the bottom sediments of a now-vanished lake, where Willis and her colleagues retrieved a 320,000-year climate record spanning the onset of glaciation. It is largely made up of annual layers created by minerals that precipitated out of the lake in summer, alternating with wintertime algal blooms. So far, the researchers have sampled the core at 2500-year intervals, extracting pollen whose species composition varies as the climate changes. They found an abrupt increase in pollen from plants of the cold, boreal forest that began 2.75 million years ago, the same time that marine sediment isotope records show ice sheet formation accelerating.
The pollen also shows short warmings and coolings lasting just 5000 to 15,000 years. Such cycles, also known from other records, are shorter than any astronomical cycle, and climate researchers think some may be overtones of Milankovitch oscillations created in the climate system, like the squeaking of an overblown clarinet (Science, 14 January 1994, p. 174). In the lake record, the buzz intensifies 2.75 million years ago, when the orbital wobbling intensified. That's just when the boreal forest raced southward and the ice sheets swelled; Willis thinks the intensified buzz could have been the trigger. She suggests that the quick bursts of cold could have fostered ice buildup, while the intervening warm periods would have been too short to melt all the ice.
Berger and others are impressed with the detailed view of climate afforded by the Hungarian lake core. “They clearly see sub-Milankovitch [climate] periodicities,” says Berger, but he says the connection between Milankovitch forcing, the climate buzz, and the onset of glaciation is not yet so clear. The answer may still lie in a closer look beneath the sunflowers.
DOE to Review Nuclear Grant
- David Malakoff
The U.S. Department of Energy (DOE) is reconsidering a grant that critics say will fund “cold fusion” experiments. DOE officials this week announced that a special review panel will take a fresh look at the science underpinning the $100,000 project, which proposes to test a new method of transforming radioactive waste into harmless byproducts. The restudy represents a potentially embarrassing stumble for DOE's new $19 million Nuclear Energy Research Initiative (NERI), which DOE officials pledged would use top-notch external reviewers to pick the best projects (Science, 11 December 1998, p. 1980).
The grant, to George Miley, a nuclear engineer at the University of Illinois, Urbana-Champaign, is intended to fund tabletop experiments to test the feasibility of treating nuclear waste using low electric fields and thin metallic films to produce “low-energy nuclear reactions.” It's one of 45 awards, chosen from among 308 proposals and announced in May, for studies into everything from lightweight reactors to new radioactive waste cleanup technologies.
In an abstract (neri.ne.doe.gov/awardlist.html), Miley noted that preliminary experiments in which nickel, palladium, and titanium films were “highly loaded with protons” and then energized with electricity had produced reactions that appeared to transmute radioactive elements into safer byproducts and produce “excess energy.” The approach, he told Science, “was motivated by a swimming electron theory,” which suggests that high electron densities on the films can aid nuclear reactions. Further trials, he wrote, were needed to nail down “this breakthrough science.” In particular, he requested funds to refine materials and to perform analyses designed to make sure the byproducts were produced by the reactions and not by accidental contamination.
The project's apparent similarity to controversial cold fusion experiments—which have unsuccessfully sought to use electrochemical reactions to spark energy-producing nuclear fusion at room temperature—raised eyebrows both within and outside DOE. An official at DOE's Germantown, Maryland, office first raised questions about the project in early June, according to NERI program manager John Herczeg. DOE officials decided that Miley's proposal should have been handled by the agency's Office of Science, which arranged reviews of NERI's basic research proposals, and not by the Office of Nuclear Energy, which oversaw the program's engineering grants. In late June, nuclear office chief Bill Magwood asked the science office to look at the grant, for which funds had not been disbursed. That office is recruiting three reviewers, who are expected to issue their opinion next month.
One group, however, says DOE should act immediately. “The credibility of DOE will be irreparably damaged unless funding for this cold fusion proposal is immediately withdrawn,” Edwin Lyman, scientific director of the Nuclear Control Institute, a Washington-based arms control group, wrote in a 6 July letter to Energy Secretary Bill Richardson. The award, he told Science, “raises questions about the adequacy of DOE's peer review … the whole [NERI] project needs to be looked at under a microscope.” DOE officials, however, say that Miley's grant is the only NERI award scheduled for further scrutiny.
Miley says the turnabout “came as a complete shock.” The proposal “is speculative but based on extensive experimental data,” he says. And although his work has been identified as cold fusion, he says it is “radically different—we have trouble getting the cold fusion people to understand what we are doing.” The difference, he says, is that whereas cold fusion experiments focus on fusing deuterium atoms, his work involves proton-metal reactions. He is also worried about the fate of three graduate students in his lab if DOE rescinds the award.
The flap could also jeopardize NERI's future. Despite backing from White House advisory panels and several well-placed lawmakers—including Senate Budget Committee chair Pete Domenici (R-NM)—DOE has had trouble building political support for its nuclear energy science budget, which Congress zeroed out in 1997 due to concerns about quality and other issues. NERI's commitment to peer review helped reverse the tide last year, and program officials were hoping for a $6 million increase to $25 million next year. But “the idea that DOE is spending money on questionable science could renew the doubts,” says one Senate aide. Whether or not the grant is canceled, he says, the episode “will prompt a lot of questions.”
- CIRCADIAN RHYTHMS
CRY's Clock Role Differs in Mice, Flies
- Marcia Barinaga
A clock would be useless without a way to set it, and that's certainly true for the circadian clock that controls our daily biological rhythms. Several research teams reported last fall that the light-absorbing protein cryptochrome (CRY) seems to fill that role in plants and flies, synchronizing the clock to the 24-hour light-dark cycle. But research in mice raised the possibility that, in mammals, CRY might be a cog in the clockworks itself rather than the light receptor. Now two papers, one in this issue of Science and the other in this week's Cell, show how CRY interacts with the mouse and fly clocks, confirming that its roles in the two clocks are quite distinct.
In flies, Steve Kay's team at The Scripps Research Institute in La Jolla, California, reports on page 553, light triggers CRY to reset the clock by interacting directly with a clock protein called TIMELESS. But in mice, Steven Reppert of Harvard Medical School in Boston and his colleagues report in Cell, CRY is part of a group of proteins that make up the central clock mechanism and may not be a light receptor at all. “This is a clear difference between flies and mammals,” says clock researcher Joseph Takahashi of Northwestern University in Evanston, Illinois. And that, Takahashi points out, has become a recurring theme in animal clocks, which use the same cast of proteins but often in different roles.
At the core of the fly's clock are two proteins, PERIOD (PER) and TIMELESS (TIM), whose levels rise and then fall over the course of a day. This oscillation is caused by a feedback loop in which PER and TIM accumulate, then team up to turn off their own genes. That causes PER and TIM levels to drop until they can no longer repress their genes, and the cycle starts again. Light can reset the clock by turning the genes on prematurely, and 3 years ago clock researchers found that it does this by inactivating TIM.
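The delayed negative-feedback loop described above can be sketched with a toy equation in which a protein represses its own production after a lag (standing in for the time it takes transcription and translation to respond). All constants here are illustrative choices, not measured fly parameters.

```python
import numpy as np

# Toy delayed negative-feedback oscillator: protein level x is produced
# at a rate repressed by its own past value x(t - tau), and decays at a
# constant rate. Sharp (Hill-type) repression plus a long enough delay
# yields sustained oscillation instead of a steady state.
dt, tau, hill_n, total_time = 0.01, 5.0, 10, 200.0
delay_steps = int(tau / dt)
steps = int(total_time / dt)

x = np.empty(steps)
x[:delay_steps] = 0.1                       # constant pre-history

for i in range(delay_steps, steps):
    past = x[i - delay_steps]               # delayed self-repression
    production = 1.0 / (1.0 + (past / 0.5) ** hill_n)
    x[i] = x[i - 1] + dt * (production - x[i - 1])   # Euler step

# After transients die out, the level keeps rising and falling rather
# than settling at the fixed point x* = 0.5.
late = x[steps // 2:]
assert late.max() - late.min() > 0.2
```

The rhythm here is the same logic as the PER/TIM cycle: levels build, switch their own source off, fall until repression is relieved, and start again.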
How that happens was a mystery until last fall, when Jeff Hall and Michael Rosbash of Brandeis University in Waltham, Massachusetts, along with Kay, found that light can't inactivate TIM in flies with a mutant cry gene (Science, 27 November 1998, p. 1628). That meant CRY is part of the light-resetting pathway, possibly the light receptor itself. But it didn't explain how CRY affects TIM.
Now Kay's team has provided that explanation. They began with cultured fruit fly cells engineered to make PER and TIM but not CRY. The cells don't have a running clock, but the researchers follow PER and TIM activity in the cells by measuring their repression of a gene that contains the same control region as the tim gene. Light has no effect on that repression, so the researchers decided to add CRY to the system, Kay says, to see if that would “at least partially reconstitute the photoreception events.” It did: Adding CRY and exposing the system to light put an end to the gene repression by PER and TIM.
To see what protein CRY was acting on, the researchers then used antibodies to pull CRY from the cells. It emerged as part of a protein complex with TIM. That suggested, but did not prove, that CRY acts directly on TIM. Proof came when Kay teamed up with Charles Weitz at Harvard Medical School. Weitz's team performed a test called the two-hybrid assay, which uses yeast cells designed to turn blue when two foreign proteins—CRY and TIM in this case—bind to each other. The yeast test confirmed that CRY does directly bind TIM, and what's more, the pairing requires light. That shows that “cryptochrome actually is a photoreceptor that directly touches TIMELESS and sequesters it,” pulling it out of action and thus resetting the fruit fly clock, says clock researcher Paul Hardin of the University of Houston in Texas.
As it turns out, things work differently in mice. Last fall, a team led by Aziz Sancar of the University of North Carolina, Chapel Hill, reported that mice missing one of their two cry genes have clocks with abnormal light responses. That suggested that CRY is a circadian photoreceptor. But the clocks of those mice also ran abnormally in the dark, convincing some researchers that CRY is a central component, rather than a light sensor, in the mouse clock. That idea got a boost in April, when Jan Hoeijmakers's team at Erasmus University in Rotterdam, Netherlands, showed that mice missing both cry genes have no clock at all (Science, 16 April, p. 421).
That report spurred Reppert to look at CRY's clock function in mice. The mouse clock, like the fly's, depends on a protein feedback loop, but in mice the PER proteins seem to enter the nucleus and shut off their genes without help from TIM. When Reppert's team tried to reconstitute the clock by putting active per genes into cultured fibroblast cells, however, the PER proteins did not move completely into the nucleus, nor did they completely shut off the genes. “Our cell culture assay was missing something,” says Reppert.
When Kazuhiko Kume, a visiting scientist from the University of Tokyo, tried putting CRY into the cells, the team was “blown away” by the result, Reppert recalls. CRY was the missing element: The gene inhibition that had been partial now was complete. In subsequent experiments, the team showed that CRY forms a protein complex with PER and helps it move into the nucleus, where CRY and PER turn off not only the per genes but the cry genes as well. The fact that CRY is “required to get the repression you need in the feedback loop” pegs it as a central clock component, says clock researcher Carla Green of the University of Virginia, Charlottesville.
CRY's shift in roles from a stimulator of gene expression in the fly to a repressor in mice is a fascinating evolutionary twist, says clock researcher Michael Young of The Rockefeller University in New York City. In the same evolutionary span, the protein seems to have lost its role as a photoreceptor as well, a function it maintains in organisms as diverse as plants and flies. It is too soon, Takahashi says, to say for certain that CRY is not, in addition to being part of the clockworks, also a photoreceptor for mammalian clocks, but “right now all the evidence suggests that it's not.”
Faculty Protest Proposed Reform
- Michael Baker
SEOUL—Street protests by thousands of university professors have led the Korean government to modify an ambitious plan to enhance research and strengthen graduate education. Although officials say the changes are minor, some supporters of the government's plan worry that the modifications will severely undermine its goals.
The plan, announced this spring and called Brain Korea 21, or BK21 (Science, 18 June, p. 1902), calls for spending $1.2 billion over 7 years on a handful of strategic fields, including biotechnology and materials science, as well as the traditional disciplines of biology, chemistry, and physics. The money would go to universities that have pledged to break down departmental barriers and reduce cronyism, as well as up-and-coming regional institutions. Some grants are targeted for newly formed groups from consortia of institutions that agree to cut the number of undergraduate students, expand their graduate programs, diversify admissions criteria, and establish a performance-based pay system for professors.
But last month in Pusan, 1000 university professors, mostly from the humanities and social sciences, carried signs and chanted slogans declaring that the reform—and the increased funding—bypasses their fields and could actually increase the concentration of resources at elite schools. On 8 July the protest was repeated in Seoul with double the number of disaffected faculty.
That pressure has led the government to open the door to proposals from outside the natural sciences and to scrap a plan to adopt a performance-based pay system. A typical professor at a national university with 2 decades of experience earns just under $3000 a month (private universities pay about 50% more), and some saw the proposed merit system simply as a way to reduce their pay. Officials at the Ministry of Education, which designed BK21, say that the natural sciences component is going ahead on schedule.
The concessions may have resulted from the weak position of a politically troubled government hit by several recent scandals. “The program is good for the universities. … But the simple story is that the government is politically unable to do it,” says Chung Sung Chul, director of the Science and Technology Policy Institute in Seoul. An editorial in the English-language Korea Herald says that eliminating the proposed merit-pay system invalidates the plan, which takes aim at “the poor research records of professors and a closed recruitment system.” Without those changes, the newspaper says, “the BK21 is simply a waste of tax money.”
Most scientists still back the plan, however, because it promises strong support for high-quality research. Last week Lim Jeong Bin, a biology professor at Seoul National University, was preparing his group to meet a 20 July deadline for BK21 applications. He remains optimistic that the plan will not unravel. “I think the protests will subside,” he says, and the government will move ahead.
Mapping Smells in the Brain
- Marcia Barinaga
A whiff of perfume or the smell of wood smoke may dredge up complex memories, but every smell starts as a simple code. Now, a team at Duke University Medical Center in Durham, North Carolina, has developed a powerful new tool for reading the brain's smell code.
Each sensory system has a code for the information it receives. For example, hearing uses a frequency code, while the olfactory system encodes odors by chemical composition. There are over 1000 different olfactory receptor proteins found on neurons in the nose, each of which recognizes a particular chemical feature of some odor molecules. The neurons send their signals to the brain's olfactory bulb, where each of thousands of little clusters of neurons called glomeruli receives input from olfactory neurons with just one receptor type. That means each smell should activate a unique pattern of glomeruli—the “code” for that smell.
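The combinatorial logic of such a code can be sketched in a toy example. The receptor names and chemical “features” below are invented for illustration; real odorant-receptor chemistry is far richer.

```python
# Toy sketch of a combinatorial smell code: each receptor type (and its
# glomerulus) responds to one chemical feature, and an odor activates
# the set of glomeruli whose features it carries. Receptor names and
# features here are made up for illustration.
receptor_features = {
    "R1": "hydroxyl",
    "R2": "aldehyde",
    "R3": "long_carbon_chain",
    "R4": "ring",
}

def glomerular_pattern(odor_features):
    """Return the set of glomeruli activated by an odor's features."""
    return frozenset(r for r, f in receptor_features.items()
                     if f in odor_features)

# Two chemically different odors yield two distinct activation
# patterns -- each pattern is that odor's "code".
odor_a = {"hydroxyl", "long_carbon_chain"}
odor_b = {"aldehyde", "ring"}
assert glomerular_pattern(odor_a) == frozenset({"R1", "R3"})
assert glomerular_pattern(odor_a) != glomerular_pattern(odor_b)
```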
Researchers want to know how the brain uses that code to process olfactory information further, and now Duke neuroscientist Lawrence Katz and graduate student Benjamin Rubin have developed an essential tool for doing so. In the July issue of Neuron they report that they have used an optical imaging technique to see the patterns of glomeruli that respond to particular odors in rat brains—the first time that's been done in living mammals.
“This is really a breakthrough,” says Randolf Menzel of the Free University of Berlin, who studies olfaction in honeybees. He and others note that because the olfactory system is so well characterized molecularly and structurally, the technique should offer neurobiologists a rare opportunity to examine and manipulate the ways the brain processes specific sensory information.
Katz and Rubin decided to try a technique on the olfactory bulb that had been used for years on the visual system. Developed by Amiram Grinvald of the Weizmann Institute of Science in Rehovot, Israel, the method, called intrinsic signal imaging, involves shining light on a patch of brain surface of a living animal. An analysis of the light bouncing back can reveal changes in blood oxygenation (via changes in light absorption by hemoglobin) or changes in the light-scattering properties of neural membranes, both of which reflect changes in neural activity.
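At its core the analysis is a differential map: compare light recorded during stimulation against blank frames and look for small fractional changes in reflectance. A minimal sketch on synthetic data follows; the image size, signal strength, and threshold are invented, and real pipelines involve far more averaging and correction.

```python
import numpy as np

# Synthetic sketch of intrinsic-signal analysis. Active tissue changes
# its optical properties slightly, so activity appears as a small
# fractional dip in reflected light:
#   map = (stimulus frame - blank frame) / blank frame
rng = np.random.default_rng(0)
blank = 1000.0 + rng.normal(0, 1, (64, 64))   # baseline reflectance
odor = blank.copy()
odor[20:24, 30:34] -= 20.0                    # one "glomerulus" darkens ~2%

dR_over_R = (odor - blank) / blank            # fractional signal map
active = dR_over_R < -0.01                    # activation threshold
```

Thresholding the fractional map picks out the responding patch while leaving quiet tissue untouched, which is how individual active glomeruli can be resolved against the surrounding bulb.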
Rubin tried the technique on rats, removing or thinning the part of the skull lying over their olfactory bulbs, then measuring the pattern of optical signals in the bulbs when the anesthetized animals were exposed to different odors. The technique worked beautifully, says Katz, with a resolution “10-fold better than in the visual system,” enabling Rubin to clearly visualize individual glomeruli. Each odor produced a unique pattern of active glomeruli.
The optical imaging is a vast improvement over earlier methods, which entailed exposing a rat to an odor for 45 minutes (an unnaturally long time), then killing it and looking for changes in the olfactory bulb's uptake of a labeled form of glucose, which also indicates neuronal activity. That approach can test only one odorant per animal, and, Menzel adds, “one never knows whether the neuronal … code might not change” under such long stimulation. Katz and Rubin, he says, “used stimulation which is rather natural” in concentration and timing.
That advantage, coupled with the high resolution and the flexibility of being able to expose a single animal to many odors at different concentrations and under various conditions, is what has researchers so excited. What's more, the imaging can be used to guide other techniques. For example, once researchers identify the glomeruli that respond to a particular odorant in a living animal, Katz says, it is “not that difficult” to use electrodes to examine how the glomeruli interact, enabling researchers to check the hypothesis that active glomeruli turn up the contrast in their signal by inhibiting the responses of their neighbors.
Olfaction is also “perfect for looking at learning and memory,” Katz says, “because one thing rodents learn very well is odors.” He and others are eager to ask how the glomerular code for an odor may change if the rat learns to associate a smell with, say, food, something Menzel has already shown to be the case in honeybees. The possibilities don't stop there.
Katz's team now has the technique working in mice, and because the mouse odorant receptors have been cloned, researchers can use genetic engineering to generate receptor molecules tagged with a fluorescent protein, enabling them to associate specific glomeruli with specific receptors, or even genetically change the receptors or their neurons to see how that affects olfactory processing. What's more, optical imaging can likely be done on higher olfactory processing areas in the cerebral cortex, where smells may interact with other perceptions or memories, to ask how the patterns from the olfactory bulb are translated and transformed in those areas.
Indeed, says Grinvald, the possibilities opened by Rubin and Katz's result are already drawing new participants into the field of olfaction. “I know of two very good groups that jumped on this project as soon as they heard that the imaging is working so well,” he says. Others are bound to follow.
- INFECTIOUS DISEASES
Gene Sequencers Target Malaria Mosquito
- Michael Balter
A group of insect geneticists, genome researchers, and funding officials has put together a plan to open a new front in the war against malaria: the sequencing of the genome of Anopheles gambiae, the mosquito primarily responsible for spreading the disease in Africa. “Anopheles would be the first insect disease vector to be sequenced,” says Carlos Morel, director of the United Nations' Special Program for Research and Training in Tropical Diseases, which hosted a meeting earlier this month in Geneva to discuss strategy. The participants will submit proposals to major biomedical agencies in Europe and the United States to fund the project, which would take an estimated 5 years and cost between $50 million and $90 million.
Malaria researchers say that sequencing the mosquito's genome—which, at 260 million base pairs, is about the size of one large human chromosome—should lead to a better understanding of interactions between the insect and the parasite. “Having the genome sequence would be fantastic,” says molecular entomologist Robert Saunders of the University of Dundee in the United Kingdom.
Malaria kills more than 1 million people worldwide each year, and an estimated 86% of those deaths occur in Africa, where it is the second leading cause of mortality after AIDS (Science, 14 May, p. 1101). The disease is caused by the protozoan parasite Plasmodium, which infects red blood cells and causes them to burst when progeny parasites are released. Since 1995, an international team of researchers has been sequencing the genome of Plasmodium falciparum, the species responsible for the most serious form of malaria. The proposed Anopheles sequencing project would complement this work.
To enter its human host, Plasmodium needs the help of the Anopheles mosquito, which injects the parasite into the host's bloodstream while it ingests blood. Some strains of Anopheles, however, are resistant to the parasite, mounting an immune response that kills off the protozoan before it can mature. What some researchers call the “Holy Grail” of malaria control would be to create a genetically modified mosquito incapable of transmitting Plasmodium, an aim that would be greatly aided by knowing the sequence of the mosquito's genome.
Although the project's supporters have yet to raise the funding, they were encouraged by the fact that the meeting was attended by emissaries from major genome research agencies as well as leading gene sequencing centers. “I'm not really worried” about getting the money, says Fotis Kafatos, director of the European Molecular Biology Laboratory (EMBL) in Heidelberg, Germany, who initiated the project together with Anopheles expert Frank Collins of the University of Notre Dame in Indiana. The participants included representatives of NIAID—the infectious disease institute of the U.S. National Institutes of Health (NIH)—and the Wellcome Trust, Britain's mammoth biomedical research charity. Also attending were gene sequencing jockeys from the Wellcome Trust-funded Sanger Centre near Cambridge, France's Genoscope, and The Institute for Genomic Research in the United States.
Kafatos points out that a lot of research has already been done on the Anopheles genome, including genetic mapping and preliminary sequencing at Genoscope, EMBL, the University of Iowa, and other centers. “We have really already started, and as the money comes in that will determine how fast we go,” Kafatos says. NIH has already said it will consider grants of up to $1.5 million per year for at least the first 2 years of the project. Wellcome director Michael Dexter told Science that although there is “excitement” at the trust about the proposal, funding decisions will have to wait until a replacement is found for outgoing Sanger director John Sulston.
Anopheles researchers argue that recent advances in efforts to create transgenic mosquitoes have added urgency to the plan. Mosquitoes have long proved awkward to modify genetically, in part because their eggs are hard and difficult to inject with foreign DNA. Last year, however, two research groups, one led by Collins and the other by Anthony James of the University of California, Irvine, succeeded in injecting foreign genes into embryos of Aedes aegypti, the mosquito vector for the viruses that cause yellow fever and dengue fever. This raised the hope that Anopheles might be similarly modified. “People will say that it's science fiction until the day you do it,” comments Morel. “The human genome is already being sequenced, and so is Plasmodium. Once we have Anopheles, we will have all three actors in the malaria cycle.”
X-ray Crystallography Without Crystals
- Robert F. Service
For much of his career, David Sayre has been seeing spots and doing everything he can to get rid of them. Sayre, an x-ray crystallographer now retired from IBM, makes images of materials using x-rays, which can reveal fine detail down to the arrangement of atoms in a molecule. But this ultrahigh-resolution imaging technique only works on crystals, in which many copies of a molecule are lined up in a regular array. When x-rays are targeted at such a crystal, they bounce off the atoms and interact to produce a set of diffraction spots, which researchers can mathematically reconstruct into an image of the molecule. Now Sayre and colleagues in the United States and the United Kingdom have done away with the need to form molecules into a crystal and diffract x-rays into spots. In this week's issue of Nature they report creating the first diffraction image from a noncrystalline sample, a feat that could revolutionize the imaging of the vast array of materials that cannot be crystallized, providing ultrahigh-resolution images of everything from cells to individual protein molecules.
“It's really a brilliant experimental achievement,” says Eaton Lattman, an x-ray crystallographer at The Johns Hopkins University in Baltimore. Sayre and his colleagues—Jianwei Miao and Janos Kirz at the State University of New York (SUNY), Stony Brook, and Pambos Charalambous at King's College London—used their new technique to produce images of an array of tiny gold dots with a resolution of 75 nanometers. That doesn't match the resolution available from crystalline samples, which can be hundreds of times finer, but it's already better than the best optical microscopes. And Miao told Science the team has already improved the resolution to about 65 nanometers and expects to do considerably better.
The technique is an outgrowth of conventional x-ray diffraction, which requires knowing two properties of the diffracted x-rays to make an image. The first is the intensity of the diffraction spots—easily determined with a photon counter. The second property is the relative timing of the waveforms of the x-rays, known as their phase. Figuring out the phase is more troublesome, traditionally requiring researchers to compare the diffraction pattern from a pure crystal with one from a similar crystal in which heavy metal atoms substitute for some components of the crystal. The signals from the metal atoms provide reference points from which the phase of the other x-rays can be worked out.
That's all well and good for orderly crystals. But noncrystalline samples don't produce clean diffraction patterns studded with sharp, isolated spots. Instead, they generate splotchy patterns. The key is that in these splotches, the intensity varies smoothly from one pixel in the diffraction image to the next in a manner related to the phase. In the early 1980s, other researchers suggested that it might be possible to use that information to work out the phase of x-rays diffracted from such samples. So the SUNY-King's College team created an algorithm designed to extract an image from this fuzzy diffraction data by making a wild first guess, assessing its accuracy, making adjustments, and then repeatedly trying again.
The program starts with the intensity data in the splotchy diffraction pattern and combines this with random phase information generated by the computer to churn out an approximate image of the target responsible for the diffraction. It then adjusts this image by comparing it to a set of known mathematical constraints. Next, it reconverts the revised image back into the corresponding diffraction intensity data and phase information. It combines the new phase information with the original intensity data to generate a new picture. Repeating this cycle about 1000 times, the computer homes in on a final image.
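The cycle just described—measured intensities plus random starting phases, repeated enforcement of known constraints, roughly 1000 passes—is the classic error-reduction style of iterative phase retrieval. A minimal NumPy sketch of that loop follows; this is not the team's actual code, and the `support` mask and non-negativity condition are illustrative stand-ins for the "known mathematical constraints" the article mentions:

```python
import numpy as np

def reconstruct(measured_intensity, support, n_iter=1000, seed=0):
    """Error-reduction phase retrieval (illustrative sketch).

    measured_intensity: 2-D array of diffraction intensities (|F|^2).
    support: boolean mask marking where the object may be nonzero
             (the real-space constraint).
    """
    rng = np.random.default_rng(seed)
    amplitude = np.sqrt(measured_intensity)
    # Combine measured amplitudes with random phase information.
    phase = rng.uniform(0.0, 2.0 * np.pi, measured_intensity.shape)
    F = amplitude * np.exp(1j * phase)
    img = np.zeros_like(amplitude)
    for _ in range(n_iter):
        # Approximate image of the target from the current estimate.
        img = np.fft.ifft2(F)
        # Adjust the image against the known constraints:
        # real, non-negative, and confined to the support region.
        img = np.where(support, np.maximum(img.real, 0.0), 0.0)
        # Reconvert to diffraction space; keep the new phases but
        # restore the original measured intensity data.
        F_new = np.fft.fft2(img)
        F = amplitude * np.exp(1j * np.angle(F_new))
    return img
```

Each pass keeps the data that were actually measured (the intensities) and lets only the unknown phases evolve, so the loop homes in on an image consistent with both the diffraction pattern and the constraints.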
The new algorithm is designed to work with low-energy, “soft” x-rays, which are ideal for imaging biological materials. Such samples vary greatly in the number of soft x-ray photons they diffract at different wavelengths, says Ian Robinson, a physicist at the University of Illinois, Urbana-Champaign. So, researchers should be able to create high-resolution, high-contrast composite images of cells by combining separate images taken at different soft x-ray wavelengths, he says. Down the road, adds Louise Johnson, a biochemist at Oxford University, the technique could make it possible to image single protein molecules, eliminating the need to crystallize them first, often a major hurdle for protein crystallographers. But generating enough diffraction data from a single molecule will require new x-ray sources billions of times brighter than today's.
U.S. Science Advocate George Brown Dies
- David Malakoff
Scientists have lost one of their leading advocates in Congress. Representative George E. Brown Jr. (D-CA), the oldest member of the House of Representatives and a leader of its Science Committee, died 15 July of an infection following open heart surgery. The physicist-turned-politician was 79.
Brown studied physics and engineering at the University of California, Los Angeles, in the 1940s and entered politics in 1954, winning a congressional seat in 1962 that he held, apart from a 2-year hiatus after losing a Senate race in 1970, until his death. He joined the House Science Committee in 1965, becoming its chair in 1990. After Republicans won control of the House in 1994, Brown became the committee's senior Democrat. That post is now expected to pass to Representative Ralph Hall (D-TX), a Science Committee veteran and a former chair of its space science subcommittee. Observers say Hall's ascension is not likely to change the committee's direction.
One of the few House members with scientific training, Brown was an outspoken and often wry advocate for government spending on basic research and a booster of crewed and uncrewed space exploration. He was also a force behind the 1976 strengthening of the White House science adviser's office and the 1972 creation of Congress's Office of Technology Assessment, which the Republican leadership disbanded in 1995.
Brown was Congress's “wise man of science,” says Rita Colwell, head of the National Science Foundation. “Even after sitting through hundreds of presentations by researchers, George never lost a genuine delight in hearing of new breakthroughs,” recalls Representative F. James Sensenbrenner Jr. (R-WI), the current chair of the Science Committee. D. Allan Bromley, dean of engineering at Yale University and a science adviser to several Republican presidents, says Brown “will be very much missed.”
- DATA DISCLOSURE
Congress Votes Down Delay in Access Law
- Jocelyn Kaiser
Congress last week rejected a proposal to overturn a controversial new law that would force the release of scientific data from federally funded research. Universities and other scientific groups concerned about the impact of the legislation are now shifting their focus to a statement due out shortly from the White House Office of Management and Budget (OMB) on how it plans to implement the law, which is expected to go into effect this fall.
The law requires “all data” produced under federally funded research to be subject to the Freedom of Information Act (FOIA), which gives the public access to government documents. It was tucked into last fall's omnibus appropriations bill by Senator Richard Shelby (R-AL), who argued that the raw data underlying regulations—such as recent new air pollution rules—should be publicly available. Scientific and university organizations have weighed in heavily against it, saying the new law would harass researchers, violate confidentiality agreements, and hinder the conduct of science (Science, 12 February, p. 914; see also Policy Forum on p. 535).
Legislators who had hoped to block the law suffered a major blow on 13 July, when the House Appropriations Committee voted 33 to 25 to reject an amendment to a bill funding OMB that would have delayed implementing the legislation for 1 year pending a study. The amendment was sponsored by James Walsh (R-NY) and David Price (D-NC). Two days later, National Institutes of Health director Harold Varmus and National Academy of Sciences president Bruce Alberts testified before another House panel on another bill to repeal the law. “We should go back to ground zero and ask, ‘What is it we're trying to solve?’” Varmus said about the bill, sponsored by Representative George Brown (D-CA), who died last week (see p. 509). But with the defeat of the Walsh-Price amendment, supporters of Brown's bill say its chances of passage appear slim.
The vote shifted attention to OMB, which is expected to issue within a few days a second version of a proposal released in February that drew more than 9000 comments. An unofficial copy circulating in Washington has eased the concerns of some who felt FOIA's exemptions for intellectual property rights and medical privacy weren't sufficient and that experimental results might be released before they had even been published. For example, the draft OMB document defines “data” as “any raw underlying information necessary to validate [research] findings, but not information that would violate the privacy rights of research subjects or the intellectual property rights of researchers.” The draft also restricts the law's reach to data “published in a peer-reviewed journal” or when cited “in a proposed rule.” Says Nils Hasselmo, president of the Association of American Universities: “It does address some of the critical issues that the scientific community had raised.”
It was unclear as Science went to press whether OMB will tinker further with this version before publishing it. Shelby's staff declined to comment until it appears in the Federal Register. Louis Renjel of the U.S. Chamber of Commerce—a strong supporter of the law—said the chamber believes OMB should apply the law not just to major rules but also to things like risk assessments. Right now, the draft defines “rule” as “an agency statement … intend[ed] to have the force and effect of law … designed to implement, interpret or prescribe law or policy. …”
Whatever it decides, OMB faces a tight schedule. The agency will allow for another 30-day comment period before issuing a final rule by 30 September.
String Theorists Find a Rosetta Stone
- Gary Taubes
Black holes—theoretical versions of them—have become crucibles for forging a theory joining the macroworld of gravity with the microworld of quantum mechanics
There is a belief promulgated by Dante, Joseph Campbell, and the makers of major motion pictures that to get out of a bad situation you must pass through the depths of the abyss. Theoretical physicists have lately taken up this philosophy, although the hell through which they must travel is the guts of black holes—not the kind in the universe at large, but what physicists often refer to, pace Einstein and his thought experiments, as gedanken black holes (or, in the words of Princeton University theorist Curt Callan, “gedanken black holes to the max”).
These hypothetical objects resemble elementary particles more than anything else, and, if real, would be smaller than a hundredth of a quadrillionth of a quadrillionth of a centimeter across. Nonetheless, they have lately taken on the leading role in string theory, physicists' most recent attempt to create a “theory of everything” that unites the forces operating on the microscopic scale of quantum mechanics with the large-scale force of gravity. Gedanken black holes have become, in effect, a Rosetta stone, says Andrew Strominger of Harvard University. In the physics of these hypothetical objects, the same phenomena can be found written in the languages of both quantum field theory and general relativity, Einstein's theory of gravity. “These are the two great achievements of 20th century physics,” says Strominger, “and for the first time we're seeing, at least in some cases, that they are really two sides of the same coin.”
If this latest string theory revolution turns out to describe the universe we live in—an enormous if—it will give physicists an unprecedented tool with which to finally develop a quantum theory of gravity. It will allow them to interpret the force of gravity not just according to the rules of general relativity—as the curvature of space-time caused by the presence of matter—but as the result of quantum mechanical fluctuations of the infinitesimal strings out of which, says the theory, all matter is composed. Whether or not the latest work leads to a working theory of everything, it is already responsible for a paradigm shift in how string theorists think about gravity, and it seems poised to provide solutions to some of the most perplexing paradoxes in the field.
The pursuit of a theory of everything and a viable quantum theory of gravity is predicated on a simple fact: Extrapolations from experimental data imply that at a scale of energy known as the Planck scale—some 18 orders of magnitude beyond what even the most powerful particle accelerators can generate—the gravitational force of the universe at large and the two forces of the microscopic universe, known as the electroweak and the strong force, would be equally strong and potentially indistinguishable. A theory that describes this unification, casting gravity in the same terms as the electroweak and strong forces, “is what we really need for a complete description of nature,” Strominger says.
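The Planck scale invoked here is the energy built from the fundamental constants; the definition below is the standard one, not spelled out in the article:

```latex
% Planck energy: the scale at which gravity is expected to become
% as strong as the electroweak and strong forces
E_{\mathrm{Planck}} = \sqrt{\frac{\hbar c^{5}}{G}} \approx 1.2 \times 10^{19}\ \mathrm{GeV}
```

Because $G$ is so small, this energy is astronomically far beyond anything an accelerator can reach, which is why the unification can only be probed on paper.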
Reasons to discount string theory as a candidate have always been easy to come by. The theory postulates that the universe is made from tiny, vibrating, stringlike particles, which can be closed loops like rubber bands or open-ended like bits of twine, and multidimensional membranes. Their different modes of vibration, akin to the harmonics on a violin string, would correspond to the different particles and forces in the universe. But this conception is supported by no deep theoretical or geometric insights—it's simply what the equations of the theory happen to describe. Edward Witten of the Institute for Advanced Study in Princeton, New Jersey, the field's impresario, calls the lack of any fundamental principles underlying string theory “the big mystery.” Nevertheless, he is sure such principles must exist. “Just as general relativity is based on Einstein's concepts of geometry,” he says, “string theory is based on deeper geometric ideas that we haven't yet understood. One facet of this fact is that we can't write down the succinct fundamental equations from which everything else should follow. We've discovered all kinds of equations but not the most fundamental ones.”
To complicate matters further, the theory exists in 10 or 11 dimensions, six or seven of which are compactified, as physicists call it, in the universe we live in. They are curled up tightly in such infinitesimal spaces that they go unnoticed, leaving four dimensions—three space, one time. All this counterintuitive weirdness aside, the biggest barrier to the acceptance of string theory as a theory of everything is that it so far provides no compelling predictions that can be tested by experiment. The problem is those 18 orders of magnitude to the Planck scale unification: “The physics is still extremely remote,” says Lenny Susskind of Stanford University.
Despite all its drawbacks, however, string theory has been the subject of thousands of papers since 1984, when it emerged as a potential theory of everything, and over 400 physicists have registered for the latest meeting in Potsdam, Germany, this month. Universities, once hesitant to hire string theorists, have been competing vigorously to get them on campus. Harvard, for example, lured Strominger away from the University of California (UC), Santa Barbara, while Stanford snatched Steve Shenker from Rutgers University, and Princeton snagged Eric and Hermann Verlinde, twins who had been tenured at separate institutions in the Netherlands.
“It is astounding and probably unprecedented that there would be that level of activity for that long in an area which so far has absolutely no tie to experiment,” says University of Chicago theorist Jeff Harvey. “The reason we keep on with it is that it seems to lead to new physical insights and beautiful things, wonderful structures. While that may not be proof, it's sufficiently convincing that there's either something to it, or it's got all the best minds in particle theory completely hornswoggled.”
Without experiment to guide them, string theory practitioners engage in the theoretical equivalent, testing what happens to their equations when they push the relevant parameters to their limits—when they make the forces between strings extremely strong, for instance, or when they add more or less symmetry to their equations. The more symmetry they add, the more constraints they put on the problems, making them easier to solve, if less realistic. String theorists talk about tweaking these parameters as though they're playing with the dials on a stereo to see what happens to the music. “Much of the progress we make,” says UC Santa Barbara theorist Joe Polchinski, “just comes from taking things we know and trying to push them further, or by looking for puzzles where we can't understand what the physics does when we vary parameters. In that sense it is almost like experimental physics: We don't really know what the theory is. We know a lot of things about it, and we're accumulating facts and trying to put them together into a theory.”
The work then proceeds in a manner almost unique in science. Because practitioners publish their work electronically, through the e-print archives at the Los Alamos National Laboratory in New Mexico, the entire community can read a paper hours after its authors finish typing the last footnote. As a result, no one theorist or even a collaboration does definitive work. Instead, the field progresses like a jazz performance: A few theorists develop a theme, which others quickly take up and elaborate. By the time it's fully developed, a few dozen physicists, working anywhere from Princeton to Bombay to the beaches of Santa Barbara, may have played important parts.
Since 1996, the quantum mechanical properties of black holes have been the dominant theme in this performance, and the field has been playing it with a passion. “Everybody is working on it in one way or another,” says Strominger—evidence of either the power of the approach or what Princeton theorist Igor Klebanov calls “a certain amount of herd mentality in the field.”
The second revolution continues
The latest series of breakthroughs, assuming they pan out, constitutes the second half of the second revolution in string theory. The first revolution came in 1984, when theorists realized that the theory could conceivably account for all the particles and forces in the universe. The second began a decade later and led directly to the latest progress. Until 1995, string theorists could study the behavior of their equations only in the simplest possible cases, consisting of a few elementary strings with very weak interactions between them. That wasn't adequate for testing the behavior of strings when they interact strongly, under conditions like those in the atomic nucleus or the innards of a black hole. “With strong interactions, where everything is pulling strongly on everything else, you can't use such a simple approximation,” says Polchinski. “We had no good tools for understanding what's happening.”
In 1995, however, string theorists discovered a wealth of what they call “dualities.” These were pairs of equations that allowed them to understand what happens when strings interact strongly, by working instead on “dual” formulations of the relevant equations at weak interactions (Science, 15 September 1995, p. 1511). “With these dualities,” says Polchinski, “we had a whole new, remarkable set of tools to map out how physics changes when interactions become strong. To make an analogy to water, it was like prior to 1995, we knew about steam but nothing else. Now we know about steam and water and ice, and how they change when you change the parameters. It was that kind of conceptual leap.”
With their new understanding, the string theorists also found that their equations seemed to describe a slew of potentially fundamental particles—“not just strings,” says Harvey, “but membranes or blobs or higher dimensional widgets.” It would take another year to figure out what role these objects played in the theory. In September 1995, Polchinski settled the issue, providing the springboard for the latest progress. Klebanov calls it “the lightning bolt” and says, “all we've been doing since then is milking various applications of his idea.”
Polchinski realized that the astonishing new multidimensional objects were all variations on D-branes—objects that he and two students, Jin Dai and Rob Leigh, had identified in 1989 without recognizing their full importance. Some D-branes are stringlike and one-dimensional, while others are surfaces in two, three, or more dimensions. Polchinski, Dai, and Leigh defined them simply as the surfaces on which open strings could end, just as a table leg ends at a table.
Polchinski used dualities to reexamine his D-branes and came up with a comprehensive set of rules for calculating the quantum dynamics of these new objects and understanding their role in the string theory universe. Among the more remarkable properties of Polchinski's D-branes was that their electromagnetic repulsion and their gravitational attraction canceled each other out. As a result—at least on paper and in the imaginations of theorists—they could be stacked on top of each other to create massive objects: objects, as Callan says, “that can be as heavy as you like.” For instance, you could wrap one-dimensional stringlike D-branes around one of the tiny compactified dimensions of the string theory universe, or you could wrap multidimensional D-branes around multidimensional compactified spaces, then add more D-branes, inexorably piling on mass. “You make these little Tinkertoy constructions of these wrapped D-branes,” says Harvey, “and if you do it in the right way, you get something which at large distances is indistinguishable from a black hole.”
In fact, such a Tinkertoy construction, if it existed, would manifest all the properties of a black hole as defined by the rules of general relativity, even though it was constructed purely from the stuff of string theory. It would be so massive as to trap light within its gravitational field; it would have an event horizon, beyond which no light or anything else can escape. And, most critical to what would follow, it would also have a temperature and an entropy, which is a thermodynamic concept that can be thought of as the amount of disorder or randomness in a system. Entropy can also be thought of as the number of different ways you can generate the energy of a system from the combined energy of all its microscopic constituents (atoms, for instance, or molecules—or strings, or even D-branes). Entropy is lowest when the temperature of the system is at absolute zero. As the temperature rises, the number of different possible states of the system that can generate the energy increases, as does the entropy.
In the mid-1970s, Cambridge University physicists Stephen Hawking and Jacob Bekenstein, now of the Hebrew University of Jerusalem, had demonstrated that the familiar kind of black holes, described by general relativity, must have both a temperature and an entropy, and that they obey a set of laws equivalent to the laws of thermodynamics in gases. In gases, as Ludwig Boltzmann had shown in the 19th century, entropy could be derived by counting all the microscopic configurations that molecules in the gas could adopt, which physicists call the microstates of the system. So Hawking and Bekenstein's result implied that a black hole's entropy could be calculated not just by its description from general relativity, known as the Bekenstein-Hawking formula, but by counting microstates. And that, in turn, strongly implied that black holes had a microscopic description, making them a potential bridge between the macro- and the microworld.
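Neither formula is written out in the article; for reference, the standard forms of the two entropies being compared are:

```latex
% Boltzmann: entropy from counting the W microstates of a system
S = k_B \ln W
% Bekenstein-Hawking: black hole entropy from the horizon area A alone
S_{\mathrm{BH}} = \frac{k_B c^{3} A}{4 \hbar G}
```

The program described in the following paragraphs amounts to reproducing the second formula by evaluating the first, with strings and D-branes supplying the microstates to be counted.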
Such a description was beyond any theories of the time. To be meaningful, it would have to satisfy three constraints, says Strominger. “One, it had to include quantum mechanics; two, obviously, it had to include gravity, because black holes are the quintessential gravitational objects. And, three, it had to be a theory in which we're able to do the difficult computations of strong interactions, because the forces inside black holes are large. String theory has these first two features: It includes quantum mechanics and gravity. But until 1995, the kinds of things we could calculate were pretty limited.”
Polchinski's work on D-branes provided the tools to do the calculations. If a black hole consisted of D-branes stacked together, physicists might be able to convert its microscopic properties into its entropy. Strominger and Harvard theorist Cumrun Vafa looked for a theoretical black hole that they could build out of D-branes, then find its entropy by counting microstates. “There are all kinds of black holes with different numbers of dimensions and different charges and so on,” Strominger says. “What we discovered was a black hole in five dimensions. We ended up with this one because it was the only one we could map to a problem we could solve.”
To be precise, Strominger and Vafa took the 10 dimensions of the string theory universe and compactified them down to five. Then they wrapped five-dimensional and one-dimensional D-branes around the compactified dimensions, ending up with what Strominger calls a “rather complicated bound state of D-branes, contorted and twisted, wrapping around the internal dimensions.” Because D-branes are defined as the surfaces on which strings end, strings were also stuck to the D-branes, and these strings, like the D-branes, had excitations running around them. Strominger and Vafa used statistical techniques to count all the possible quantum states of this tangle of strings and D-branes, giving them one measure of the entropy. Then they applied the Bekenstein-Hawking formula, based on general relativity, to find the other. The two agreed exactly.
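For reference, the result of that counting, in the standard form found in the literature (the article itself does not display it), expresses the entropy in terms of the numbers of branes and momentum units in the bound state:

```latex
% Strominger-Vafa: microstate count for a five-dimensional black hole
% built from Q1 one-dimensional branes, Q5 five-dimensional branes,
% and n units of momentum
S_{micro} = 2\pi \sqrt{Q_1 Q_5 \, n}

% In units where G, hbar, c, and k_B equal 1, this matches the
% Bekenstein-Hawking area formula S = A/4 for the same black hole.
```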
Impressive as that agreement was, the work still generated skepticism. For all its twists and contortions, the black hole that Strominger and Vafa had constructed, called an extremal black hole, is the simplest of gedanken black holes, and its simplicity made it a questionable example. Black holes slowly evaporate through a process known as Hawking radiation. Unlike other black holes, however, extremal black holes carry an electromagnetic charge; as they evaporate, the electromagnetic force eventually cancels out the evaporation and halts the process. That makes extremal black holes relatively easy to work with, says Strominger, “because they're not changing. They're just sitting there.”
Theorists worried that what works for tidy extremal black holes might not work for more complicated and more interesting black holes—the “gray bodies,” for instance, that are still emitting Hawking radiation. In the 18 months that followed, says Strominger, various teams tried “to build a more precise dictionary relating these two descriptions of black holes.” First, Callan and Juan Maldacena, who was then his graduate student, and independently, Strominger and Gary Horowitz of UC Santa Barbara, constructed and calculated the entropy for what Callan calls “the next more complicated” black hole. They imagined what would happen if their hypothetical tangle of D-branes and strings had vibrations traveling in opposite directions. These waves could collide, annihilate each other, and emit a suitably stringlike particle that would be free to escape the system, taking energy with it. “That corresponds to a temperature” that can be calculated, Callan explains. “And then you can re-express it in terms of the properties of the general relativity black hole that this thing is modeling. And, son of a gun, it gives you exactly the Hawking formulas; you get the right Hawking temperature and the right Hawking radiation rate. These things match beautifully.”
Then Klebanov and Steven Gubser of Princeton, and Amanda Peet, now at UC Santa Barbara, tried to do a similar calculation with four-dimensional D-branes, rather than the tangle of five- and one-dimensional ones. After all, four dimensions—three spatial and one temporal—“is almost our world,” says Klebanov. “We did it, and it seemed to almost work.” Next, Samir Mathur of the Massachusetts Institute of Technology and Sumit Das of the Tata Institute of Fundamental Research in Bombay, India, calculated how quickly a black hole with some temperature would cool down to its ground state at absolute zero. This time, the two descriptions agreed perfectly.
The next paper, says Callan, was “even more amazing.” Strominger and Maldacena teamed up to calculate the dual descriptions of the total energy spectrum of the radiation emitted by a black hole, which is shaped by its gravity as well as temperature. When they calculated this spectrum from the equations of string theory, they got what Callan describes as “some crazy function”—which happened to agree exactly with the result given by general relativity. “Bingo,” says Callan. “This is telling you something really uncanny is going on.”
By this time, Maldacena, who had moved to Harvard, had begun to put his finger on that uncanny something, the reason why these various descriptions of the black hole and its behavior agreed so well. The quantum theory that described the excitations on D-branes and strings, he speculated, was not only describing the quantum states inside the black hole, “but somehow also the geometry of the black hole close to its event horizon”—that is, close to the edge of the black hole. He formulated these thoughts into what string theorists now call Maldacena's conjecture, which states, for certain cases, that a quantum theory with gravity and strings in a given space is completely equivalent to an ordinary quantum system without gravity that lives on the boundary of that space. The conjecture represents the zenith—so far, at least—of revolution number two.
Maldacena took the two ways of looking at a black hole within string theory (one as the quantum mechanical tangle of D-branes and the other as the massive object described by general relativity) and studied what happens to them when the temperature of the black hole approaches absolute zero. On the D-brane side, as energy—and hence mass—dwindles at this low temperature limit, the gravitational force goes to zero, as do the interactions among the strings and the D-branes. What's left is a simple species of quantum field theory called a gauge theory, which is familiar to physicists because, among other reasons, it describes the electroweak and strong forces.
On the general relativity side of the equations, the result of lowering the temperature was even more surprising, says Strominger: “You start out with a universe with a black hole and strings in it, and then you lower the temperature. As you do that, the space-time of the universe literally gets frozen. Nothing can happen in it except for very, very near the event horizon of the black hole, where there will always be a region hot enough to allow strings to move around freely. The lower the temperature in the space-time, the closer to the horizon that region is forced to be.” The space-time of the wider universe is a complex mixture of flat and curved regions, but in this sliver of a region close to the event horizon it becomes more uniform and symmetrical. The simpler geometry simplifies the string theory that lives in it. “At the end of the day, what's left is a quantum theory of gravity [that] is considerably simpler than the theory we started out with,” says Strominger.
This was, in effect, a first glimpse of the ultimate (so far) duality: The complicated D-brane gemisch that made up the black hole had reduced to a simple field theory without even strings in it; the general relativity description of the black hole had reduced to a simple quantum theory of gravity. “Before this, it was thought that gravity theories are inherently different from field theories,” says Maldacena. “Now we could say that the gravity theory is the same as the field theory.”
Everybody in the field seems to interpret this duality a little differently, depending on their own mental images of the universe. Jeff Harvey, for instance, understands it to mean that instead of having two different ways to compute what's going on with the black hole, you now need only one—the microscopic string way. “Maldacena's conjecture says that all the things of ordinary gravity are somehow contained within what happens on these D-branes. It is almost as though gravity is some kind of residual field left over when you treat these other gauge theories in the right way.” To Polchinski, the conjecture suggests that D-branes are somehow the atoms from which black holes are made, and gravity is just the combined effect of all these excited strings and D-branes furiously undergoing quantum fluctuations.
The conjecture also had potent implications extending well beyond black holes. (For one of them, known as holography, see sidebar on p. 515.) In particular, Maldacena's most precise formulation of the conjecture linked the simple string theory near the horizon of a black hole to a particular gauge theory known as a large N gauge theory, which had mystified physicists for 25 years. Gerard 't Hooft of the University of Utrecht in the Netherlands had worked on these theories in the 1970s to understand the behavior of the strong force of the atomic nucleus, which theorists can calculate precisely only at very high energies or temperatures. (N stands for the number of colors in the theory; colors are the strong-force equivalent of electromagnetic charge.) He also suggested that such large N gauge theories might in fact be string theories, because when physicists drew diagrams describing the interaction of elementary particles in these gauge theories, the diagrams looked a lot like the interactions of strings. 't Hooft did this work well before string theory emerged as a potential theory of everything. “The fantasy that gauge theory might really be a string theory has kicked around for a long time,” says Stanford's Steve Shenker.
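The limit 't Hooft studied has a standard definition, given here for context (it is not spelled out in the article): the number of colors is taken to infinity while a particular combination of coupling and color number is held fixed, and in that limit the gauge theory's diagrams organize themselves like the surfaces swept out by strings.

```latex
% 't Hooft limit: N colors with gauge coupling g_YM; take N to infinity
% while holding the 't Hooft coupling lambda fixed
N \to \infty, \qquad \lambda \equiv g_{YM}^2 N \;\;\text{held fixed}
```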
The catch was, no one had made much progress on the large N gauge theories, either. Now, through the circuitous route of black holes and string theory—“a minor miracle,” says Klebanov—Maldacena's conjecture suggested a connection between this four-dimensional, large N gauge theory and a simple string theory that physicists knew how to solve. By making the large N theory tractable, the connection may even be a route to understanding the strong interactions.
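Stated in the standard notation of the field (supplied here for reference; the article keeps the discussion in prose), the best-studied case of Maldacena's conjecture equates a specific string theory with a specific four-dimensional gauge theory:

```latex
% Maldacena's sharpest case: string theory with gravity in the bulk
% is equivalent to a gauge theory without gravity on the boundary
\text{Type IIB strings on } AdS_5 \times S^5
\;\;\Longleftrightarrow\;\;
\mathcal{N}=4\; SU(N) \text{ super Yang-Mills in four dimensions}
```

The left side is a gravitational theory in five-dimensional anti-de Sitter space (times a five-sphere); the right side is the large N gauge theory living on its boundary.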
Two papers followed Maldacena's, one by Witten and one by Princeton collaborators Klebanov, Gubser, and Alexander Polyakov, who had done crucial work on the four-dimensional gauge theories. These provided specific recipes for what theorists can and cannot calculate when they use a string theory to understand its dual gauge theory. The two papers, says Callan, “very specifically showed what's the rule, what's the recipe, how do you make this connection. And people have been elaborating on it ever since.”
So although Maldacena's conjecture has yet to yield any further profound understanding of string theory, string theory is allowing physicists to make progress on theoretical questions that lie much closer to the real world. Whether Revolution Number Two will yield a theory of the universe we live in is still a wide-open question, however. As Strominger says, the work has produced everything a theoretical physicist could want, “except for an experimentally verifiable prediction. It gives us a very precise and explicit relationship between seemingly disparate fields of investigation, a relationship that we can use in some cases to solve problems we've wanted to solve for decades. And it has suggested new ideas that we've previously lacked the imagination to think about.”
Once again, string theorists have made enormous progress, but if you ask them where that progress is leading them, they'll still admit that they have no idea. “Part of the goal,” says Harvey, “is still to figure out what the hell it all has to do with reality.”
The Holographic Universe
- Gary Taubes
Black holes have turned out to be fertile ground for extending string theory and demonstrating its connections to known physics (see main text). Along the way, string theorists may have solved a puzzle posed by black holes, which has troubled theorists for decades: the question of what happens to information that falls into a black hole. String theory, combined with earlier theoretical work, implies that the information swallowed up by the black hole is somehow expressed on its boundary, just as a three-dimensional object can be captured in the two dimensions of a hologram.
Basic principles of physics teach that information in the universe is preserved: If you had perfect knowledge of the present, you could, in theory at least, reconstruct the past and predict the future. (Such perfect knowledge is impossible in practice, of course.) Suppose you threw an encyclopedia into a fire, for example; if you had perfect knowledge of the radiation emitted and the ensuing motions of all the atoms and molecules, you could, with infinite attention to the details, reconstruct the knowledge inside the encyclopedia. Physicists refer to their equations as “unitary”—that is, they preserve information.
But once information falls through the horizon (or edge) of a black hole, it can never get back out. This would not be a problem if the information simply remained inside the black hole, but, as Stephen Hawking demonstrated in the 1970s, black holes evaporate through a process called Hawking radiation. Pairs of particles are created outside the hole, one with positive energy and one with negative energy. The negative one drops into the black hole, lowering its mass, while the positive one comes out as radiation that is independent of what was inside the black hole. “Hawking claims that when the black hole disappears, the information is irretrievably lost,” explains Andrew Strominger of Harvard University. “This was a very shocking statement because it bears on our most fundamental, cherished beliefs about the laws of physics.”
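The evaporation Hawking described proceeds at a rate set by the hole's temperature. The standard formula (not quoted in the article) shows why the process runs away: the temperature is inversely proportional to the mass, so the hole radiates faster and faster as it shrinks.

```latex
% Hawking temperature of a black hole of mass M: inversely
% proportional to the mass, so evaporation accelerates as
% the hole loses mass
T_H = \frac{\hbar c^3}{8 \pi G M k_B}
```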
Although no one has found a flaw in Hawking's argument, few physicists believe that black holes should behave in a way contrary to everything else in the universe. Over the last 3 years, however, string theorists may have finally found a way around the problem, by providing support for a concept called holography, proposed in the mid-1990s by Lenny Susskind of Stanford University and Gerard 't Hooft of the University of Utrecht in the Netherlands, with contributions from other theorists, such as Kip Thorne of the California Institute of Technology in Pasadena and Jacob Bekenstein of the Hebrew University of Jerusalem. Although the concept is surprisingly simple—it says that a theory within a region of space-time is equivalent to a theory on the boundary of that region—it “was considered absolutely off the wall by all general relativists and basically all string theorists,” says Susskind.
Last year a thoroughly holographic conjecture by string theorist Juan Maldacena of Harvard University won over some of the skeptics. Maldacena's conjecture says that a string theory, describing both gravity and microscopic quantum interactions, in a given space is equivalent to an ordinary quantum system without gravity that lives on the boundary of that space. Applied to black holes, Maldacena's conjecture implies that the theory describing the nature of a black hole's interior is equivalent to a conventional quantum field theory describing the boundary of the black hole—the kind of unitary theory in which information is conserved. And because Hawking radiation is emitted from the boundary, the conjecture implies that it carries the information that would otherwise disappear as the black hole evaporates.
“This string theory episode has completely cut around the back of the barn,” says Princeton University theorist Curt Callan. “It suffices to say that the thing at the bottom of the black hole is constructed out of strings, or D-branes,” another fundamental object in string theory. “And you can use perfectly unitary rules for their interaction, to compute stuff that looks for all the world like Hawking radiation. So the combined system of what's outside and what's inside looks like a perfectly standard quantum mechanical system.”
The implications of holography and Maldacena's conjecture could go well beyond the black hole paradox and lead to what Edward Witten of the Institute for Advanced Study in Princeton, New Jersey, calls “a real conceptual change in our thinking about gravity.” They imply that what happens inside the black hole is somehow transferred a macroscopic distance to the surface, says Samir Mathur, a theorist at the Massachusetts Institute of Technology. “We have not been able to see how the conjecture or holography actually does that,” he says. “That is the crucial point, because the moment we understand that, I think we will learn something very big about the nature of gravity.”
- EVOLUTION '99 MEETING
Development Shapes Evolution
- Elizabeth Pennisi
MADISON, WISCONSIN—Once rarely considered in evolution discussions, development was a hot topic for researchers here for the Evolution '99 meeting, held from 22 to 26 June.
Why Most Mammals Are Neck And Neck
Bats and giraffes look nothing alike and live very different lives. Despite that, they and most other mammals have exactly the same number of neck vertebrae: seven. Generations of biologists have puzzled over this strange consistency. Many other skeletal components do vary according to the needs of the individual organism, and the number of neck vertebrae can vary dramatically in other animals, such as birds. A swan, for example, has about 25, while swifts have just 13. Now, Frietson Galis, a functional morphologist at Leiden University in the Netherlands, may have ferreted out the reason why mammals have maintained a seven-vertebra neck over millions of years of evolution.
Based on her searches of the scientific literature, she proposes that a serious developmental constraint limits change in mammalian neck vertebrae. As Galis reported at the meeting, any alteration of the genetic program responsible for generating those seven chunks of backbone greatly increases the risk for embryonic cancers—those arising because embryonic tissue continues to proliferate instead of specializing into a particular organ. The constraint, she argues, operates in mammals but not in other, less cancer-prone animals. Günter Wagner, a developmental biologist at Yale University in New Haven, Connecticut, says the proposal is “the first rational explanation for a phenomenon that has never really made sense.” (The work also appears in the spring issue of Molecular and Developmental Evolution.)
Galis first began to consider the cancer connection after a colleague who studies neuroblastomas, tumors that arise from embryonic nerve tissue and usually occur in children, mentioned that many neuroblastoma patients also have congenital rib abnormalities. And when she searched the medical literature, she discovered that others, including R. Schumacher of the University Children's Hospital in Mainz, Germany, and Steven Narod of the University of Toronto, had also noted an association between skeletal abnormalities and childhood cancers.
Galis then learned that researchers had found that mouse strains tend to develop abnormal ribs when one or another of the Hox genes, which help set up the organization of the backbone and other parts of the embryo, is inactivated. And other studies in mice had linked the inactivation of Hox genes or their regulators to cancer. For example, some found increased rates of leukemia and related cancers when the Hox gene regulator Mll was inactivated or when the Hoxb-8 gene was overactive; others found an increase in intestinal cancer when another Hox regulator, Cdx2, was missing. Taken together, these findings are making it “much clearer that Hox genes can serve a dual function,” helping promote cell growth as well as tissue organization, says Narod.
Consequently, Galis says, mutations in the genes would be highly deleterious, especially if combined with other mutations that lead to cancer. As a result, Hox gene changes affecting neck vertebrae number are unlikely to persist, even if they make a neck better suited to a particular organism's way of life, Galis concludes.
Reptiles aren't subject to this constraint, she proposes, because they have lower metabolic rates than mammals and are thus less likely to produce highly reactive oxygen free radicals that can damage the DNA and produce mutations that, in conjunction with altered Hox gene activity, lead to cancer. And neither are sloths and manatees, which also have lazy metabolisms—and six to nine neck vertebrae. That leaves birds, which also have high metabolic rates, but vary their neck vertebrae as freely as reptiles.
Apparently, high metabolism in birds doesn't exact the same mutational cost, Galis learned. Studies show that pigeon and canary cells do not generate the large numbers of reactive oxygen molecules that mammalian cells do. Perhaps as a result, the cancer rate in birds is about half that in mammals, and most of the cancers that do develop are caused by viruses. “The difference is really striking,” Galis says.
Galis will have a difficult time proving that cancer risk limits the number of mammalian cervical vertebrae to seven. Even so, notes evo-devo biologist Jessica Bolker of the University of New Hampshire, Durham, “she's done a good job of pulling together some really diverse kinds of data to come up with a really convincing hypothesis.”
Sex and the Single Cockroach
Most of us don't even want to think about cockroaches that can clone themselves. But evolutionary biologist Allen Moore of the University of Manchester in the United Kingdom and his colleagues cherish one such creature, the African cockroach Nauphoeta cinerea, as an opportunity to learn more about why so few organisms reproduce asexually, even when they could make do without a mate.
As Laura Corley, a former graduate student in the Moore lab, reported at the meeting, the roach shows that asexual reproduction is a hard road. Working with Moore and his wife, developmental biologist Patricia Moore, when the team was at the University of Kentucky, Lexington, Corley found that only a few N. cinerea females—those with the most varied genetic makeup—are able to reproduce without sex, and they generate few offspring.
Quite a few sexual species, particularly insects, can sometimes switch to parthenogenesis, in which their unfertilized eggs develop into new individuals. Evolution should favor the process, because it allows an individual to pass all of its genes to each progeny, instead of the 50% transmitted by sexual reproduction (Science, 25 September 1998, p. 1980). But even in insects that can make the switch, asexual reproduction is rare—posing a long-standing puzzle in biology.
One reason, Corley and her colleagues found in N. cinerea, may be that only a few individuals within a species are capable of parthenogenesis. They first separated out immature female cockroaches from their lab colonies to prevent them from mating. Only 14% to 44% of these virgin females produced young. The researchers then looked at the genetic makeup of the females that could not reproduce on their own and also that of the females that could and their offspring.
They focused on six genes that they found to be polymorphic, meaning that they exist in several different versions. The researchers found that the two copies of each gene were much more likely to be different in females able to reproduce parthenogenetically than in other N. cinerea females. The single moms “were a very specific subset of the population,” says Corley, who is now at the University of Wisconsin, Madison. “It suggests that it's good to be genetically variable; it allows those individuals to be more flexible.”
The researchers don't know why that is, although they speculate that, as in so many other species, individuals with a varied genetic makeup are less likely to suffer from genetic defects that reduce their vigor. After all, as Corley found, parthenogenesis is a struggle even for roaches that can pull it off. The females reared alone produced no live young from their first two batches of eggs, and when they finally did give birth, had just six or so progeny over their lifetimes. In contrast, females reproducing sexually have well over 100 lifetime offspring. “The difference in fitness is dramatic,” Corley notes.
Rudolf Raff, a developmental biologist at Indiana University, Bloomington, attributes the poor asexual reproduction to the reproductive machinery being “very creaky.” It's set up to produce gametes with half the genome, he notes, and thus only rarely can individuals generate eggs with the full number of chromosomes. Thus, once sex evolves in an organism, going back to asexual reproduction becomes very difficult.
Neither Moore nor Corley knows just why, but having both sexual and asexual behavior within a species should make the problem easier to study, Corley notes. Raff agrees: “[The cockroaches] could be quite useful for further studying this phenomenon.”
- U.S.-CHINA TIES
Biomedical Group Lobbies NIH
- Eliot Marshall
A delegation of Chinese scientists, some working in the United States, is urging NIH to boost its funding of research in China.
LEXINGTON, MASSACHUSETTS—Adopting U.S.-style lobbying techniques, Chinese biomedical researchers are pressing the U.S. National Institutes of Health (NIH) to support a skein of collaborative projects aimed at advancing science and mending frayed relations between the two countries.
Last week a delegation of scientists from the United States and China met with top NIH officials in Bethesda, Maryland, to discuss such ideas as high-tech methods of analyzing traditional Chinese herbal medicines, research on human genetic variation, a scheme to create a genetic knockout mouse production center in China, and AIDS vaccine trials. The ideas were developed at an extraordinary gathering here the previous weekend, where more than a score of Chinese researchers met U.S. counterparts, along with NIH head Harold Varmus, to discuss research plans, renew contacts with expatriates, and make new connections. “Biomedical science can break the ice” produced by “an early winter between our two countries” brought on by recent allegations that Chinese scientists helped steal U.S. nuclear secrets, says Yiming Shao, an AIDS researcher from the Chinese Academy of Preventive Medicine, who attended both meetings.
The delegation, headed by deputy minister of public health Peng Yu, departed from NIH with no specific commitments from the Americans. But Varmus offered some encouragement when he said that science can bridge political differences. And Gerald Keusch, director of NIH's Fogarty International Center and organizer of the group's visit to Bethesda, described the 14 July talks as a “first step” that may lead to “concrete” agreements. “The way we left things,” he says, “is that they would do some thinking about their highest priorities and we would think about ours.”
Prospects for collaboration on genetic studies are already good, aided by new guidelines governing the export of genetic material, which the Chinese government adopted last fall (Science, 18 September 1998, p. 1779). Ming Tsuang, a psychiatric geneticist at Harvard University, announced at the Lexington meeting that he has been approved to receive samples and that “the system is working well.” Since then at least one other transfer has been approved, according to Xiping Xu, a Harvard School of Public Health epidemiologist who, as president of the Boston, Massachusetts-based Association of Chinese Professionals in Biomedicine, helped organize the meeting.
To get a closer look at what China can bring to collaborative ventures, Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, is heading to Beijing in September, and Keusch will be joining a group of National Science Foundation officials going to China in October. Last week Varmus announced plans to go to Beijing in November to join other Nobel laureates in celebrating the 50th anniversary of the Chinese Academy of Sciences.
But Chinese scientists are eager to get projects moving sooner. They tried to persuade Varmus during his visit here that NIH should invest more research dollars in China. Varmus listened politely but made no commitments. Keusch notes that NIH already backs several major collaborative projects in China, including a tropical diseases research center in Shanghai, a national study of cardiovascular disease, and AIDS prevention.
Although the mood in Lexington was optimistic, attendees acknowledged the tension between China and the United States. Cardiology researcher Jie Wang of Columbia University said he had been questioned by an FBI agent in February about his involvement with a group of Chinese scientists interested in drug development. He said he had ignored the incident on the advice of his university, but he asked Varmus what a person should do if “harassed” by an FBI agent over participation in “meetings like this.”
Varmus responded forcefully: “Just let me know … I am ready to speak out.” The words pleased the Chinese scientists, who hope that scientific collaboration will also warm the climate between the two countries.