News this Week

Science  10 Jan 1997:
Vol. 275, Issue 5297, p. 155


  1. Molecular Biology

    Opening the Way to Gene Activity

    1. Elizabeth Pennisi

    A flurry of activity during the past 9 months has brought the chemical modification of histone proteins to the fore in the regulation of gene expression

    For decades, molecular biologists bent on understanding how the cell controls the activity of its genes rarely paid much attention to the genes' milieu. In the living cell, the DNA of the genetic material is tightly bound up with histones and other proteins, which together form the chromatin. Even though biologists suspected that the histones' grip had to be loosened before other proteins could turn genes on and off, chromatin is “so complex and so messy that there was very little attempt to really look at it,” recalls developmental geneticist John Lucchesi of Emory University in Atlanta. But results from several labs are now making it impossible for researchers to ignore chromatin structure and its modifications any longer. Indeed, says molecular biologist Kevin Struhl of Harvard Medical School, “The whole field has literally exploded in the past 9 months.”

    In that explosion, researchers have begun to identify parts of the machinery that alters chromatin structure so that gene-regulating and transcribing proteins can do their job. The findings all support the notion that a chemical reaction called acetylation, in which simple chemical groups known as acetyls are added to the histones, is important. The modified proteins then hold less tightly to the nearby DNA, opening the way to activating gene expression. The idea isn't new, but there was little direct evidence for it because researchers could not get their hands on the enzymes that add and remove the acetyl groups. All of that has changed now with the identification of four distinct nuclear histone acetylating enzymes and five more enzymes that undo the reaction by removing acetyls from histones.

    Besides pinning down these enzymes, the new work links them to gene expression and to changes in cell growth. All four acetylating enzymes have turned out to be proteins already known to associate with transcription factors, the proteins that regulate gene expression. And their activity has been intimately linked to the control of the cell cycle, the carefully choreographed changes in gene expression and other cellular activities that culminate in cell division. There are even indications that abnormal acetylation can lead to cancer development. “It looks like [acetylation] is a fundamental cell regulatory process,” says organic chemist Stuart Schreiber of the Howard Hughes Medical Institute at Harvard University, whose team discovered the first histone deacetylase.

    Anticipating acetylation

    Cell biologists first began to suspect that acetylation might help regulate gene expression in 1964, when Vincent Allfrey of the Rockefeller Institute (now Rockefeller University) in New York City discovered that histones are sometimes heavily acetylated. In the nucleus, the DNA is arranged in nucleosomes, beadlike structures consisting of a DNA strand wrapped around a histone core that are connected to one another by other histone molecules. And Allfrey noted that the chemical modification appeared to reduce how tightly the histones associate with the DNA in the nucleosomes.

    Allfrey's evidence also suggested that these changes influence gene activity. He found that unacetylated histones appeared to inhibit the transcription of DNA into RNA, the first step in gene expression, while histone acetylation reduced this inhibition. This led him to suggest that the acetyl groups, which become attached to the positively charged amino acid lysine, lessen histones' attractiveness to nearby DNA by neutralizing the proteins' positive charge. This change would then make it possible for the proteins needed for gene activity to get close enough to interact with the DNA. Conversely, Allfrey predicted, removal of the chemical side groups would close the door on gene transcription by restoring histone's tight connections with DNA.

    Since then, the link between the amount of acetylation of the histone cores and the rate of transcription has strengthened. But because no one could pin down any enzymes capable of adding or removing acetyl groups, researchers were unable to take the work any further. “A lot of people tried, we tried, but nobody had fingered a polypeptide that was responsible for the acetyltransferase activity,” says cell biologist C. David Allis of the University of Rochester in New York.

    The closest anyone came was in 1995, when Rolf Sternglanz from the State University of New York, Stony Brook, and shortly thereafter, Dan Gottschling, now at the Fred Hutchinson Cancer Research Center in Seattle, and their colleagues found an enzyme that adds acetyl groups to newly formed histones. But that reaction, which occurs even before the histones move into the nucleus, could not have anything to do with gene control.

    The nuclear acetylating enzymes that are more likely to do the job apparently are too scarce to show up in the standard protein-sorting methods. That same year, however, in what Allis describes as perhaps “a last-ditch effort,” he and James Brownell in his lab developed a new way to track down the elusive nuclear acetylating enzymes.

    Brownell first mixed histones into a gel used to sort proteins by size. After separating proteins in nuclear extracts on this modified gel, he then added radioactively labeled acetyl coenzyme A as a source of acetyl groups. He and Allis reasoned that the previously undetectable band of acetylating enzyme would transfer the labeled acetyl groups to the histones in the gel. The presence of these easily detectable, radiolabeled histones would then indicate their quarry's location in the gel.

    Because the ciliated protozoan Tetrahymena is known to have a lot of acetyl groups attached to its histones, Brownell and Allis used extracts of this organism's nuclei in their search. It turned out to be a good choice. Their radiolabeling technique produced a band of acetate-labeled histone, and by what Allis describes as “brute-force” methods they collected and purified enough of the protein located at that band site to determine a partial amino acid sequence. Jianxin Zhou in Allis's lab then synthesized the DNAs that could encode this amino acid sequence and used them as probes to find and clone the gene encoding the protein, which they named HAT A (for histone acetyltransferase type A).

    A search of existing gene databases for genes whose sequences resemble those of the HAT A gene yielded an unexpected but exciting result. As Allis's team reported in March 1996, the Tetrahymena HAT enzyme turned out to be very similar to a protein called Gcn5p (for general controlled nonrepressed protein) that had first been discovered in 1992 by yeast geneticist George Thireos at the Institute of Molecular Biology and Biotechnology in Heraklion on the island of Crete, and later again by Leonard Guarente of the Massachusetts Institute of Technology (MIT) during a search for yeast genes that affect transcription. But beyond playing a role in transcription, “it was completely unclear what these genes [and their proteins did],” says Harvard's Struhl.

    The Allis team's discovery that their HAT A is very similar to Gcn5p raised the possibility, however, that Gcn5p is itself a histone acetyltransferase. And indeed, working with Sharon Roth at the M. D. Anderson Cancer Center in Houston, the researchers found that Gcn5p works in yeast much as HAT A does in Tetrahymena. “It was a somewhat startling result,” recalls Harvard yeast geneticist Fred Winston. “It brought two things together that hadn't been brought together before: something that was a transcription factor was [also] a histone modifier.” Adds Struhl: “It turned everything [regarding acetylation and transcription] from this vague correlation to specific molecules.”

    Acetylase lineup.

    Many transcription factors and other proteins interact to bring about the reading of DNA (seen here in white, wound around blue histone cores). Several of these, including p300/CBP, PCAF, and TAFII230/250, are acetylase enzymes, which may add acetyl groups to histones at multiple sites.

    Y. Nakatani

    Mammalian HATs

    The emerging story would soon take an even more intriguing twist, when work already under way in molecular biologist Yoshihiro Nakatani's lab at the National Institute of Child Health and Human Development (NICHD) led not only to the discovery of a mammalian counterpart to the yeast acetylating enzyme, but connected such enzymes to the cell's growth-control pathways. Nakatani was studying E1A, an oncogenic protein made by adenovirus that overrides the natural brakes on proliferation in mammalian cells by altering gene transcription. He knew that in order to exert its growth-stimulating effects, E1A has to bind to a large cellular protein, called p300/CBP. Typically, p300/CBP forms a complex with certain transcription factors, helping them promote specific patterns of gene expression. But exactly how E1A binding to p300/CBP releases cell growth controls was unclear. To get a better picture, Nakatani needed to track down p300's normal protein partners. He found one of them, which he called PCAF (for p300/CBP-associated factor), by looking for the mammalian counterpart of Gcn5p.

    At that point, the yeast protein was not yet known to be an acetyltransferase. But when Nakatani heard that Allis had shown that Gcn5p has acetylating activity, Xiang-Jiao Yang in his group tested PCAF for similar activity. It, too, was one of those elusive enzymes, he and his colleagues reported in July 1996. And it's not the only one in mammals. As Nakatani continued his work, he and NICHD's Bruce Howard and Vasily Ogryzko began to suspect that p300/CBP itself might be capable of acetylating histones because they could detect acetylation by the p300/CBP-E1A complex even in PCAF's absence. Their hunch proved correct, as the group described in November. Then, less than a month later, Andrew Bannister and Tony Kouzarides of the University of Cambridge in England reported the same result.

    The discovery that p300/CBP and PCAF acetylate histones indicates that the reaction is important for turning on a wide range of genes. Although Gcn5p apparently has a rather limited range of action, p300/CBP has “been shown essential for a lot of genes that are acutely regulated,” says Alan Wolffe, a biochemist at NICHD. And further evidence that acetylation is a common prerequisite for gene expression came just as 1996 drew to a close when Nakatani and Allis found that a protein called TAFII230/250, which is a part of a large transcription factor complex called TFIID, is also a histone-acetylating enzyme. TFIID is necessary for the initiation of transcription of all protein-coding genes, so the result further expands the role of acetylation in gene expression, Allis says.

    The HATs found so far have somewhat different specificities. For example, while p300/CBP can add acetyl groups to all four histones in the nucleosome core, Gcn5p acetylates only two of them—histones H3 and H4. And the different acetylating enzymes are also specific about which histone amino acids they will acetylate. These differences suggest that the added acetyl groups may be more than just “on” or “off” signals for transcription. They may also serve, Allis says, as “specific flags,” for attracting other proteins that can fine-tune gene activity. “Depending on which particular site [on the histone] gets acetylated, it might make a difference in downstream effects with other [molecular] machinery that needs to bind to chromatin,” he proposes.

    A newly discovered potential HAT in male fruit flies shows one possible kind of discriminating acetylation reaction: acting only on one chromosome. In Drosophila, the male's X chromosome makes up for the fact that it is present in only a single copy by producing as much gene product as do both copies of that same chromosome in the female. Emory's Lucchesi has identified five proteins that seem to trigger this dosage compensation, as it's called. One of them, he says, looks very much like a HAT. That HAT seems to acetylate a particular lysine on the H4 histones along the X chromosome but doesn't appear to touch histones on other chromosomes. “What we think is that we have a gene that encodes a specific HAT that is targeted to the X chromosome in males and that is responsible for a very well-defined enhancement of transcription,” Lucchesi says of MOF (for males absent on the first), the name of the gene.



    Cell-cycle regulation

    Anything that affects gene expression is likely to influence whether or not cells proliferate, and that goes for HATs as well. Most likely, proteins that normally interact with PCAF and p300/CBP keep the cell cycle in check, presumably by turning on genes that inhibit cell growth while turning off those that might foster cell division. But as Nakatani has shown, E1A binding to p300/CBP displaces PCAF from the complex, and this may alter HAT activity and thus gene acetylation patterns, allowing expression of genes that enable cells to start dividing and perhaps preventing the expression of genes that inhibit cell division.

    Indeed, changes in acetylation may play a role in the uncontrolled cell growth of cancer. In September, two reports—one from yeast geneticist Lorraine Pillus's team at the University of Colorado, Boulder, and the other from Julian Borrow of MIT—hinted at a connection between a certain type of acute myeloid leukemia and p300/CBP. The work links the leukemia to a chromosomal translocation that fuses the gene for the acetylating enzyme with another gene. “This translocation may be misdirecting [p300's HAT] activity,” says Pillus. As patterns of acetylation change, genes normally turned off may turn on, facilitating tumor growth.

    The regulation of the converse process—the removal of acetyl groups from histones—may be just as crucial to normal cell growth. Harvard's Schreiber discovered the first of the histone deacetylase enzymes responsible for this reaction as part of his efforts to understand how trapoxin, a compound that shows potential for treating cancer, inhibits cell growth and causes tumor cells to revert to their normal, differentiated states.

    Others had shown that trapoxin blocks the removal of acetyl groups from histones and in so doing may put the brakes on cell growth by increasing or restoring gene activity that normally helps keep cell division in check. By tracking a protein that trapoxin binds in cancer cells, Jack Taunton in Schreiber's group identified the enzyme that normally removes the acetyl groups from the histones.

    And December brought news that deacetylation, like acetylation, may be accomplished by multiple enzymes. Yeast geneticist Michael Grunstein from the University of California, Los Angeles, and his colleagues found that yeast contains five separate versions of Schreiber's deacetylase.

    Many researchers expect that even more acetylating and deacetylating enzymes await discovery. “It's put a whole new spin on everybody's work,” says Allis. “Now a lot of coactivators are being screened for acetyltransferase activity.” At the same time, the researchers who have already found HATs are sorting out the functions of their discoveries. Besides figuring out the specific jobs for each, they want to know how acetylation fits into what else is known about chromatin structure. For example, they want to know whether acetylation leads or follows other pretranscription activities. Does it break open the chromatin initially, or does this type of modification simply help maintain the availability of the genes as DNA is being read?

    Allis also provides a note of caution in the newfound excitement over histone acetylation. So far, he says, all the work demonstrating that the new enzymes acetylate or deacetylate histones has been done on free histones. It is less clear how HATs work on histones that are part of a nucleosome or embedded in a series of nucleosomes. And in actual nucleosomes, histone acetylation probably interacts with other types of chemical modifications, such as methylation or phosphorylation, that also aid in chromatin remodeling. How all these fit together, “nobody knows,” says Wolffe. Nevertheless, one thing is certain, Allis notes: “The field has come to the understanding that acetylation is an important part of the activation of genes.”

    Vincent Allfrey should be pleased.

  2. Quantum Mechanics

    The Subtle Pull of Emptiness

    1. Charles Seife
    1. Charles Seife is a writer in Scarsdale, New York.

    There's no such thing as a free lunch—except in quantum mechanics. Classical physics—and common sense—dictates that the vacuum is devoid not only of matter but also of energy. But quantum mechanics often seems to depart from common sense. A paper in the current issue of Physical Review Letters describes the first successful measurement of the ultimate quantum free lunch: the Casimir force, a pressure exerted by empty space.

    The measurement, by physicist Steven Lamoreaux of Los Alamos National Laboratory, confirms the strange picture of the vacuum conceived in the 1920s by pioneering quantum physicists Max Planck and Werner Heisenberg. Even at absolute zero, they asserted, the vacuum is seething with activity. This “zero-point energy” can be thought of as an infinite number of “virtual” photons that, like unobservable Cheshire cats, wink in and out of existence—but should have a measurable effect en masse. That's what Lamoreaux has now shown. “We're excited; it confirms a very basic prediction of quantum electrodynamics,” says Ed Hinds of the University of Sussex in the United Kingdom.

    Conjuring act.

    Two closely spaced surfaces, one on a torsion pendulum, coax force from space.


    For decades after Planck and Heisenberg described the zero-point energy, physicists preferred to ignore it. It's infinite, and to a physicist, “infinity's not a very useful quantity, so we get rid of it,” says Charles Sukenik of the University of Wisconsin.

    But an early clue that these infinite fluctuations can't be ignored came in 1948, when researchers at the Philips Laboratory in the Netherlands were studying the van der Waals force—a weak attraction between neutral atoms. At long distances, the van der Waals force weakened unexpectedly. Philips scientists Hendrik Casimir and Dirk Polder found that they could explain the weakening when they pictured the force as resulting from correlated zero-point fluctuations in the electric field, which would propagate from atom to atom at the finite speed of light. Because of the lag, the chance that the atoms would feel each other's fluctuations while they were still correlated would fall off at longer ranges. This weakening, called the Casimir-Polder effect, was first accurately measured in 1993, by Hinds, Sukenik, and their colleagues.

    Casimir had also realized that the zero-point energy should reveal itself more directly, as a very weak attraction between two surfaces separated by a tiny gap. Provided the gap was small enough to exclude some of the virtual photons, the crowd of photons outside the cavity would exert a minute pressure.

    To measure it, Lamoreaux positioned two gold-coated quartz surfaces less than a micrometer apart, one of them attached to a torsion pendulum while the other was fixed. The surfaces created a “box” that allowed only virtual photons of certain wavelengths to exist inside it. Outside the box, a full complement of virtual particles was merrily winking away. The infinite zero-point energy on the outside of the box outweighed the infinite (but smaller) zero-point energy inside, forcing the surfaces together.

    By counteracting this subtle attraction with piezoelectric transducers, which exert a force when a voltage is applied to them, Lamoreaux was able to measure the force. The result: a value of less than 1 billionth of a newton, agreeing with theory to within 5%.
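
    The scale of the effect Lamoreaux measured can be sketched with the textbook result for two ideal, perfectly conducting parallel plates, where the attractive pressure is P = π²ħc/(240 d⁴). This is only an order-of-magnitude illustration, not the analysis behind the experiment, which used a curved surface and required several corrections:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant (J*s)
C = 2.99792458e8        # speed of light (m/s)

def casimir_pressure(d):
    """Ideal parallel-plate Casimir pressure (Pa) at plate separation d (m)."""
    return math.pi**2 * HBAR * C / (240 * d**4)

# At a gap of 1 micrometer the pressure is minuscule...
print(f"{casimir_pressure(1e-6):.2e} Pa")  # prints 1.30e-03 Pa

# ...but the 1/d^4 scaling makes it grow steeply as the gap closes:
print(casimir_pressure(0.5e-6) / casimir_pressure(1e-6))  # ~16
```

    The steep d⁻⁴ dependence is why the experiment demands sub-micrometer gaps: at everyday separations the force vanishes into the noise.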

    Hinds and others say the experiment should help physicists accept that the subatomic world is every bit as weird as quantum mechanics predicts. “We feel in our hearts that we really do understand how things work—even something as peculiar as vacuum fluctuations,” says Hinds. Adds Sussex physicist Malcolm Boshier, who was on Hinds's Casimir-Polder team: “This is one of those experiments that is going to wind up in all of the textbooks.”

  3. Cosmology

    Clouds Gather Over Deuterium Sighting

    1. James Glanz

    CHICAGO—Three floors below the ballroom where Craig Hogan spoke at the 18th Texas Symposium on Relativistic Astrophysics, held here from 15 to 20 December, a pair of old headlines in a showcase illustrated the perils of drawing conclusions from limited data. Not that there was any resemblance between the topic chosen by the University of Washington, Seattle, astronomer—the amount of deuterium in the early universe—and the 1948 presidential election. But Hogan found himself in much the same position as the Chicago Daily Tribune the day after its famous DEWEY DEFEATS TRUMAN banner: Faced with new data, he graciously withdrew an earlier conclusion.

    In the 1 March 1996 issue of Astrophysical Journal Letters, Hogan and his then-graduate student Martin Rugers had analyzed a gas cloud so far away that it likely contains material fresh out of the big bang. They concluded that it holds about one deuterium atom for every 5000 hydrogen atoms. That was a startlingly high value, because the more of this hydrogen isotope that emerged from the big bang, the lower the universe's total density of other ordinary matter must be (Science, 7 June 1996, p. 1429). Hogan and Rugers's value implied a universe so rarefied that it could barely contain the ordinary matter astronomers have observed by other means. But now, Hogan told the audience, a team led by David Tytler of the University of California, San Diego (UCSD), has obtained “clearly superior” data on the same cloud and failed to find the clues that had led Hogan and Rugers to their conclusion. “What we thought was a smoking gun of … [high] deuterium is not there.”

    Limin Lu, an astronomer at Caltech who has made similar measurements and who heard talks by Tytler and Hogan at the symposium, agrees: “The case for high deuterium is basically gone.” That may remove a puzzling discrepancy with values nearly 10 times lower, which Tytler and colleagues had measured in two other gas clouds. But Hogan's concession—unlike the Chicago Daily Tribune's—isn't the last word, because some astronomers maintain that Tytler's own measurements are not airtight.

    Down on deuterium.

    In this spectrum—the shadow of a distant gas cloud—a crucial peak in the apparent deuterium feature has vanished.


    All sides in this dispute are using a single weapon: the 10-meter Keck Telescope on Mauna Kea, in Hawaii. Its light-gathering power allows astronomers to record the “shadows” cast by gas clouds billions of light-years away in the light of brilliant quasars at even greater distances. How much light the clouds absorb—and at which wavelengths—holds clues to their composition. But deuterium and the far more abundant hydrogen in the clouds have such similar spectral signatures that disentangling the two is a delicate, not to say risky, task.

    The first widely reported deuterium measurement appeared in a 1994 Nature paper by Antoinette Songaila and Lennox Cowie of the University of Hawaii, along with Hogan and Rugers. They had studied a cloud along the line of sight to the quasar known as Q0014+813. Based on the depth of a small notch in the spectrum next to a deep well of hydrogen absorption, this group reported the 1-to-5000 ratio of deuterium to hydrogen atoms—but only as an upper limit. The reason for their caution, they explained, was the possibility that a small, nearby “interloper” cloud of hydrogen might be drifting through the line of sight to the main cloud. The motion of the interloper could have shifted its absorption spectrum by just enough to mimic all or part of a deuterium feature in the main cloud.

    In their later Ap. J. Letters paper, however, Rugers and Hogan concluded that this ratio could be interpreted as a firm value rather than just an upper limit. They had reanalyzed the same data and found a sharp spike of reduced absorption in the center of the apparent deuterium feature. The spike split the notch into two narrower features that looked even more like the fingerprints of deuterium. Rugers and Hogan concluded that they were seeing the shadows of high deuterium in two separate clouds. “The spike was the thing that made us think it was [purely] deuterium,” says Hogan.

    But when Tytler and UCSD's Scott Burles and David Kirkman observed the same cloud with the Keck on two nights in early November, they saw no spike at all. In retrospect, say researchers close to the project, the spike may have resulted from a technical glitch in the reanalysis. What's more, the UCSD team found that the entire notch had shifted slightly, moving it away from the frequency expected if it were caused only by deuterium. A small interloper cloud may be at least partly responsible for the hints of high deuterium after all, say researchers who have read the group's preprint.

    Tytler and co-workers, says E. Joseph Wampler of the National Astronomical Observatory in Japan, “have pretty convincingly shown that the earlier models … are incorrect, at least in detail.” But the real value of primordial deuterium is yet to be determined and could still emerge from this cloud, he says. Cowie, for example, not only raises questions about Tytler's own measurements on a different cloud but says new, unpublished data he and Songaila have gathered on Q0014+813 support their original upper limit. In cosmology as in politics, every headline is subject to change.

  4. Neutrino Detection

    Japan's ‘Super’ Site Confirms Deficit

    1. James Glanz

    CHICAGO—If neutrinos were precious gemstones, the supply of these elusive particles would scarcely be enough to satisfy holiday shoppers in one of this city's wealthier neighborhoods. But last month brought good news: Japanese researchers reported that the world's largest neutrino detector—a $100 million device located in a mine 300 kilometers west of Tokyo—has begun to collect enough data to satisfy an international clientele. These scientific customers all hope to explain why even the best source of these cosmic messengers—the sun—seems to fall short.

    Speaking at the 18th Texas Symposium on Relativistic Astrophysics, held here from 15 to 20 December, Yoji Totsuka of the University of Tokyo, spokesperson for Japan's Super-Kamiokande detector, said that the count rate at the newly constructed detector confirms a mysterious deficit in the flux of neutrinos from the sun. That deficit, which had been seen in data from Super-Kamiokande's predecessor, Kamiokande, as well as other facilities, is a crucial clue in the quest to determine whether these particles have mass—a question with profound consequences for physics and cosmology.

    Not enough.

    Super-Kamiokande is measuring half the predicted number of neutrinos from the direction of the sun.


    The data are coming in so fast, says Totsuka, that researchers should soon be able to piece together the neutrinos' energy spectrum, which may help them determine just what causes the shortfall. In just 102 days of running time since April, Super-Kamiokande “has more events than all previous solar-neutrino experiments have gotten in 30 years,” says John Bahcall of the Institute for Advanced Study in Princeton, New Jersey, another speaker at the conference. “For me, it's thrilling.”

    Neutrinos are weakly interacting particles first identified in 1956 after being posited a quarter century earlier to explain aspects of radioactive decay. They are commonly produced in many kinds of nuclear reactions, including those that power the sun and that take place when cosmic rays collide with the upper atmosphere. They come in three “flavors,” one of which—the electron neutrino—is easiest to spot with water-filled detectors like Super-K. And according to physicists' standard picture of particles and forces, they are massless.

    The solar shortfall, however, could be a sign of neutrino mass. Bahcall, who is not part of the Super-K team, and others have suggested that neutrino mass could allow the easily observed electron neutrinos to “oscillate,” or transmute, into one of the other two types as they travel from the sun's core to Earth, eluding detection. Then again, the apparent deficit might reflect an imperfect understanding of the sun's workings.

    The details of the shortfall could help physicists choose between these explanations. Like Kamiokande, Super-K detects trails of Cerenkov light given off by electrons after their rare interactions with neutrinos. With 50,000 metric tons of water and 11,200 photodetectors, Super-K is picking up about 10 neutrinos a day from the direction of the sun (see graph). That's only half the number predicted by solar theory and the Standard Model of particle physics, which assumes massless neutrinos. In that respect, said Totsuka, “we have already confirmed the old Kamiokande data. [The detections] are well below the predicted numbers.”
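
    A quick back-of-the-envelope check, using only the figures quoted here (10 events a day over the 102-day run, at about half the predicted rate), shows why the team already calls the deficit confirmed:

```python
observed_per_day = 10                      # solar-neutrino events seen per day
running_days = 102                         # running time since April
predicted_per_day = 2 * observed_per_day   # detections are ~half the prediction

total_events = observed_per_day * running_days
deficit = 1 - observed_per_day / predicted_per_day

print(total_events)  # prints 1020 -- events collected in the first run
print(deficit)       # prints 0.5 -- half the predicted flux is missing
```

    Roughly a thousand events in barely three months is why Bahcall calls the data rate thrilling: earlier detectors needed years to accumulate samples of that size.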

    Totsuka says the project is now seeking to measure the energy spectrum of the arriving neutrinos. That information, he says, should begin to emerge in a few years, after “about a factor of 5 more data.” Well before then, the team also hopes to have data on another neutrino problem: the observed shortfall of one type of atmospheric neutrino.

    The results add up to “a beautiful experiment,” says Douglas Duncan, an astronomer at the University of Chicago who chaired the neutrino session. And the data promise to keep neutrino-hungry astronomers in a holiday mood for several more shopping seasons.

  5. Earth Science

    Same Earth, Different Dynamos

    SAN FRANCISCO—The annual fall meeting of the American Geophysical Union (AGU) drew a record 7000 attendees here in December, perhaps because its unusually late scheduling let more academics get away from classes. But there was plenty taught at the meeting, too. Participants heard how Earth's molten core might generate a magnetic field, how the deep ocean mixes, and what could have triggered the ice ages.

    The planet's core is an intellectual tease. Its inner workings, cloaked in 2900 kilometers of solid rock, are discernible only in the magnetic field that guides sailors, fends off the blustery solar wind, and channels aurora to polar skies. Geophysicists have long hoped that by studying the magnetic field at the surface, they would be able to unlock the mysteries of the core. But they received a rude shock at last month's AGU meeting, when researchers running computer simulations of how that swirling ball of molten iron might produce a magnetic field reported that very different geodynamos in the core can yield the same Earth-like magnetic field at the surface.

    “We have two different [models] that are very similar outside the core but very different inside the core,” says geophysicist Gary Glatzmaier of Los Alamos National Laboratory, who in 1995 developed the first model with Paul Roberts of the University of California, Los Angeles. Jeremy Bloxham of Harvard University, who constructed the second model with his Harvard colleague Weijia Kuang, finds the discordant results “a little disturbing. Why are both models succeeding in generating an Earth-like field?” he wonders. To Glatzmaier, “It looks like now the surface magnetic field isn't going to be enough to say how the dynamo's working. Other things, like observing the [solid] inner core's rotation, are going to be more important than we thought initially.”

    The two groups constructed their models in a similar way. Both used essentially the same equations to calculate the flow of highly conductive iron, much as a weather forecasting model simulates atmospheric flows, and to calculate the electric currents and magnetic forces generated when that conductor flows across magnetic fields. In addition, Glatzmaier notes, there has been a fair amount of intellectual cross-fertilization as the groups have shared students and postdocs. Yet, their results are very different.

    Deep-rooted differences.

    In two computer models of the geodynamo, the magnetic fields that reach the surface (right halves) are similar and Earth-like, while the deeply buried fields of the core are different.


    The geodynamo produced by the Harvard model “is very similar to the traditional view of how the geodynamo works,” says Peter Olson of The Johns Hopkins University. Its most intense contortions of magnetic field lines, where the flowing conductor produces new field lines, lie in the outermost outer core, far from the solid inner core. In contrast, Glatzmaier and Roberts's simulation puts the field generation close to the inner core, which interacts with the dynamo through magnetic field lines that pierce it.

    The most likely explanation for the differences may lie in approximations the two groups made in order to duplicate a large, complex chunk of a planet in the narrow confines of a computer model. “The problem is the viscosity of the fluid outer core—it's thought to be about that of water,” says Glatzmaier. Because such a thin fluid requires more spatial detail to simulate, putting an added burden on an already demanding computational problem, modelers compromise by exaggerating the true viscosity of the core. The results are less than perfect: “Neither of us can do it exactly the way it should be done,” says Glatzmaier.

    Then there are the parameters chosen for properties like conductivity. Glatzmaier and Roberts tended to make them realistic whenever possible, while the Harvard group took some liberties to preserve realistic proportions rather than absolute values. “They've got to take each other's parameters and run their models with them,” says Olson, which is just what the two groups will be doing. “We'll learn something that way,” says Glatzmaier, “but that won't answer which one's right.”

    For that, modelers will need some ground truth from the core itself. A good bet looks like the inner core's rate and direction of rotation, says Olson. The inner core “is like an anemometer and a weather vane at the center of the earth,” he notes; it responds to the outer core's “winds.” The recent discovery (Science, 26 July 1996, p. 428) by seismologists that the inner core is rotating eastward faster than the rest of the planet tends to support the Glatzmaier and Roberts geodynamo, which drives the model's inner core in the same direction and at roughly the same speed as observed. The inner core in the Bloxham-Kuang model has a slower relative rotation that reverses every 15,000 years or so.

    Both teams agree on one point: There's a lot more work to be done. The observational support for his competitor's model “is certainly not something I'm going to worry about until the seismology settles down a bit,” says Bloxham, noting that the range of reported rotation rates has broadened greatly as more seismologists get into the act. Glatzmaier agrees, remarking that “It's not clear to anybody which model solution is the closest to the Earth right now.” He believes it's “quite possible that the Earth experiences each of them from time to time.”

    Now that would be a real tease.

  6. Earth Science

    Elusive Ocean Mixing Nabbed

    Oceanographers know how the waters of the ocean part ways, but they have had a hard time identifying how and where those waters reunite. Near the poles, cold, salty surface waters sink into the deep sea, forming distinct deep layers that flow toward the equator. But somewhere along the way that cold, deep water mixes with warmer, fresher waters to form the relatively homogeneous seawater of lower latitudes. “We know mixing has to be going on somewhere in the ocean,” says physical oceanographer James Ledwell of the Woods Hole Oceanographic Institution (WHOI), “but it's been hard to find it.” Now, oceanographers think they know where at least some of the missing mixing takes place: over areas of rough bottom, which churn the overlying waters.

    The clue, reported at the AGU meeting, is the discovery of a broad zone of relatively intense mixing deep in the South Atlantic over a patch of rugged sea floor. “This is the first time such high mixing has been found over a broad area near the bottom,” says Ledwell, who with Kurt Polzin, John Toole, and Raymond Schmitt of WHOI conducted the latest search. “This is a step toward finding where significant mixing is going on in the ocean.” And because mixing changes water density in the abyssal and middle depths of the ocean, driving currents there, “our view of what [deep] circulation looks like is going to change dramatically in the next few years,” predicts physical oceanographer Eric Kunze of the University of Washington.

    Oceanographers had made unsuccessful searches where some theories predicted mixing (Science, 8 January 1993, p. 175), but key experiments remained out of reach: detailed measurements of turbulent mixing from the surface waters all the way into the abyss. Schmitt and his WHOI colleagues had designed a promising instrument package, the probe-studded High-Resolution Profiler, which measures turbulent mixing on scales of millimeters. Dropped over the side of a ship, the profiler free-falls toward the bottom, recording temperatures and flow velocities as it goes. Approaching the sea floor, the profiler's altimeter triggers the release of ballast, and the instrument package pops back to the surface. But its ability to detect subtle mixing had to be tested and some of its sensors strengthened before it was ready to probe the greatest depths.

    Last February, the WHOI group took the beefed-up profiler to the Brazil basin, just east of South America, in search of mixing. They knew that the cold, salty water entering this 5600-meter-deep basin from the south mixes with warmer water somewhere in the basin, because the bottom water that drains northward from the basin is warmer and fresher. To find just where and how the mixing happens, the WHOI researchers deployed the profiler at stations along a dogleg ship track from Rio de Janeiro to the mid-Atlantic ridge and back to Recife, Brazil.

    When they analyzed the profiles, a clear pattern emerged in the Brazil basin. In the western part of the basin, over the broad, gradual slopes of the South American continental rise and the smooth, flat abyssal plain, turbulent mixing was extremely low at all depths, contrary to predictions that mixing might be pervasive. But in the eastern basin, where the flank of the mid-Atlantic ridge roughens the bottom, turbulent mixing was far higher—five to 10 times higher at middle depths and 50 to 100 times higher in the bottom 200 meters. That may be enough mixing to account for all the warming of deep, cold water in the entire basin, says Ledwell.

    Bottom roughness seems to make the difference, says Polzin. He and his colleagues propose that as open-ocean tides drive water across the ridges and canyons of the ridge flank at upwards of 2 kilometers per hour, the uneven sea floor sets the water column undulating in waves analogous to the familiar waves seen on the surface of the ocean. When any of these internal waves break, the resulting “internal surf” drives the mixing of deeper and shallower waters. And because internal waves can set up oscillations in the fluid above them, they could boost mixing far above the bottom.

    Whether such bottom-enhanced mixing can account for all or even most of the world ocean's missing mixing, researchers can't say. Firming up the link between mixing and bottom topography would take similar measurements at other spots that have the same combination of strong tides and a rough bottom: the Juan de Fuca ridge off the U.S.-Canadian border, the Hawaiian island chain, the Micronesia archipelago, and the Mozambique channel in the Indian Ocean, for example.

    If bottom-enhanced mixing falls short, Kunze has a favorite alternative: the intense but localized mixing driven by submarine “waterfalls” like the one Polzin and colleagues recently measured at the Romanche Fracture Zone, where deep, cold water spills through a narrow gap in the mid-Atlantic ridge, mixing with warmer water as it goes. More forays by the profiler, together with long-term mixing studies using chemical tracers, should eventually show just how the ocean puts itself back together.

  7. Earth Science

    Out of Fire, Ice?—Part 2

    David Rea knows as well as anyone that coincidence does not prove causation. But the tighter the coincidence between two events, the stronger the argument for a causal link. And the University of Michigan paleoceanographer says a new analysis of a sediment core from the North Pacific has strengthened the case for a link he first proposed 4 years ago: a connection between a series of volcanic eruptions that rocked the northern rim of the Pacific and the world's precipitous descent into the ice ages 2.6 million years ago.

    Earth had already been cooling for tens of millions of years, perhaps because the rise of the Himalayas affected the atmosphere and weakened the natural greenhouse effect. But 2.6 million years ago, the planet suddenly slipped over the edge into a deep chill from which it has never fully recovered. In the 1970s, a few researchers suggested that a global volcanic outburst recorded in marine sediments at roughly the same time might have triggered the climate shift by lofting debris that shaded the sun, but the records were patchy and imprecise. Now, Rea and his Michigan colleague Libby M. Prueher have shown that the eruptions around the North Pacific and the sudden cooling took place within 1000 years of each other. “It looks like the climate system just needed a kick in the pants,” says Michigan paleoceanographer Theodore Moore, “and this may have been it.”

    A climate trigger?

    Eruptions like this one from the volcano Kliuchevskoi on Kamchatka in 1994 may have brought on the ice ages.


    Rea first drew a connection between North Pacific volcanism and glaciation in 1993, when he saw cores of sediment retrieved from the far northern North Pacific. A roughly 10-fold jump in the frequency of volcanic ash layers from volcanoes up to 1000 kilometers away coincided, as best the eye could discern, with a dramatic increase 2.6 million years ago in the amount of mineral grains scoured from nearby continents by glaciation and carried to sea by rivers and icebergs. Based on a first reading of that sediment record, Rea put the two events within 50,000 to 300,000 years of each other (Science, 18 June 1993, p. 1725). That's keeping pretty close company in the geologic record, but the gap left plenty of room for doubts.

    With more precise dating and more analysis, Prueher and Rea have greatly reduced the room for doubt. They find that the best of their cores shows the northern North Pacific switching from preglacial to glacial conditions in just under 1000 years. That's too quick to be driven by other suggested climate forcing mechanisms, says Rea, such as rising mountain ranges or the changing orientation of Earth. And the abrupt climate shift continues to match up with the volcanism.

    Rea is still cautious about claiming a link. “The geologist's most serious disease is assigning cause and effect to things that occur at the same time when they may not have anything to do with each other,” he notes. To avoid contracting this dreaded syndrome, he and Prueher are undertaking an even finer dissection of the cores to see if the coincidence can be tightened still more.
