Perspective: Evolution

Reducible Complexity


Science, 07 Apr 2006:
Vol. 312, Issue 5770, pp. 61-63
DOI: 10.1126/science.1126559

If an elaborate lock fits an equally elaborate key, we immediately sense the purpose of design: The key was crafted with the idea of the lock in mind. We would not entertain the possibility that the match is accidental. When we come upon such lock-and-key pairs in nature, it is natural to ask how these pairs could have evolved via Darwinian evolution. At first glance, it seems that the key can only evolve to fit the lock if the lock is already present, and the lock cannot evolve except in the presence of the key (because without the key, it does not open). On page 97 of this issue, Bridgham et al. (1) take a closer look at this puzzle and discover a different answer in the molecular evolution of hormone-receptor interactions.

Charles Darwin was fully aware of the problems that such lock-and-key systems—should they exist in biology—would present to his theory, because the theory relies upon step-by-step changes to a trait, whereas building a lock-and-key system appears to require at least two changes to happen simultaneously. He famously remarked that “if it could be demonstrated that any complex organ existed which could not possibly have been formed by numerous successive slight modifications, my theory would absolutely break down” (2). This concern has been seized upon by proponents of an “intelligent design” alternative to Darwinian evolution. The premise of their argument is that lock-and-key systems cannot evolve and are thus “irreducibly complex” (3), implying that only the lock-and-key combination, but not its parts, is complex. The argument continues that because such systems do exist in nature, and cannot have evolved, they must have been “designed.”

Darwin already saw how such thorny issues could be resolved. He further explains in The Origin of Species that “if we look to an organ common to all the members of a large class…in order to discover the early transitional grades through which the organ has passed, we should have to look to very ancient ancestral forms, long since become extinct.” In other words, Darwin suspected that viewing only the extant complex forms will obscure the path of evolution, and present an incomplete picture. But while the fossil record has yielded many intermediate forms that suggest a continuous evolution of traits, it is too often incomplete, and does not allow us to retrace the molecular history of a gene. Reconstructing the complete evolutionary history of a complex genetically encoded function (albeit a “computational” one) was achieved recently (4), and it experimentally vindicated Darwin's idea that the target of natural selection constantly changes, so that the complex feature of today may share very little with the original function. But while such computational investigations can be very satisfying, they might not convince everybody. It is therefore gratifying that it is now possible to reconstruct the ancestral genes of an existing species so that, as Darwin urged us to do, we can “look exclusively to its lineal ancestors” to understand a gene's evolution.

Molecular evolution of a biological lock and key.

A two-dimensional schematic of an ancestral hormone receptor that binds aldosterone, cortisol, and DOC. The L111Q mutation is drastic because it eliminates receptor activation by all three molecules (depicted as an obstruction of the binding pocket). The mutation S106P, on the other hand, does not affect the binding of DOC, but allows aldosterone and cortisol to bind only very loosely. The presence of both mutations, however, allows cortisol to bind strongly again, whereas aldosterone no longer fits.

Bridgham et al. address one of the central concepts of the intelligent design argument. They did not study just any gene, but precisely a system that looks irreducibly complex: a hormone-receptor pair that we can think of as a biological lock and key. In vertebrates, the regulation of many cellular processes is controlled by steroid-receptor interactions that are highly specific. For example, cortisol activates the glucocorticoid receptor to regulate metabolism, inflammation, and immunity. In contrast, the mineralocorticoid receptor is activated by aldosterone, and controls electrolyte homeostasis, among other effects. This specificity is important, because the activation of the glucocorticoid receptor by aldosterone, for example, would be highly detrimental.

Phylogeny tells us that an ancestral corticoid receptor gave rise to the glucocorticoid receptor and the mineralocorticoid receptor in a gene-duplication event more than 450 million years ago. However, aldosterone evolved much later. Without aldosterone present, how could the mineralocorticoid receptor evolve to be activated by it? Doesn't the pair's specificity require the evolution of two traits at the same time, an event that appears highly unlikely?

Bridgham et al. took Darwin's advice and followed the line of descent to the ancestral corticoid receptor. Modern phylogenetic methods make it possible to infer such ancestral sequences and to study the properties of the reconstructed molecules in the laboratory. What the authors find is a surprise: Not only is the ancestral corticoid receptor sensitive to cortisol as expected, it is also activated by 11-deoxycorticosterone (DOC) and aldosterone. Because aldosterone was not present at the time, this sensitivity must be a by-product of sensitivity to another steroid, a promiscuity that can be exploited by evolution (5).
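To give a flavor of how an ancestral state can be inferred from extant sequences, here is a minimal sketch of Fitch's parsimony algorithm applied to a single amino acid site. The tree shape and residues are hypothetical toy data; the actual study used far more sophisticated statistical reconstruction on full-length receptor sequences.

```python
# A minimal sketch of Fitch parsimony for one amino acid site. The tree
# shape and residues below are hypothetical toy data, not from the paper.

def fitch(node):
    """Return the set of most-parsimonious ancestral states at this node.

    A node is either a one-letter state (a leaf) or a (left, right) tuple.
    """
    if isinstance(node, str):          # leaf: the residue observed today
        return {node}
    left, right = map(fitch, node)
    common = left & right
    return common if common else left | right   # intersect, else union

# Residue at one hypothetical site in four extant receptors, arranged as
# ((receptor_A, receptor_B), (receptor_C, receptor_D)):
print(fitch((("S", "S"), ("S", "P"))))  # {'S'}: the ancestor likely had serine
print(fitch((("S", "P"), ("S", "P"))))  # {'S', 'P'}: this site stays ambiguous
```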

The next task was to determine how the mineralocorticoid receptor kept the aldosterone specificity, whereas the glucocorticoid receptor lost it. This is a tale of two mutations. More phylogenetic analysis revealed that precisely two amino acid substitutions resulted in the glucocorticoid receptor phenotype—aldosterone insensitivity and cortisol (and DOC) sensitivity. Could these two mutations have occurred one after the other? Bridgham et al. tested the effect of each of these mutations—replacement of leucine-111 with glutamine (L111Q) and replacement of serine-106 with proline (S106P)—alone on the reconstructed ancestral corticoid receptor and in the presence of the other mutation (see the figure). Of the two mutations, L111Q was the more damaging: Applying this mutation to the ancestral receptor destroyed its sensitivity to all three hormones. On the other hand, the S106P change reduced receptor activation by aldosterone and cortisol but did not change the sensitivity to DOC. In the presence of S106P, the effect of L111Q was quite different: It removed any sensitivity to aldosterone, and restored cortisol sensitivity. In other words, it produced the glucocorticoid receptor phenotype. The two mutations thus turned out to be strongly epistatic: Both reduce the fitness of the system (L111Q very strongly so), but together their effect is neutral or better.
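The reported activation pattern can be summarized as a small genotype-to-phenotype table, which makes the epistasis explicit: the same L111Q substitution has opposite consequences depending on whether S106P is already in place. The sketch below is a toy encoding with coarse qualitative labels as illustrative stand-ins, not measurements from the paper.

```python
# Toy genotype-to-phenotype table for the two substitutions, with coarse
# qualitative labels summarizing the reported activation pattern (these are
# illustrative stand-ins, not measurements from the paper).

ACTIVATION = {
    # (has_S106P, has_L111Q): activation by each hormone
    (False, False): {"aldosterone": "strong", "cortisol": "strong", "DOC": "strong"},
    (False, True):  {"aldosterone": "none",   "cortisol": "none",   "DOC": "none"},
    (True,  False): {"aldosterone": "weak",   "cortisol": "weak",   "DOC": "strong"},
    (True,  True):  {"aldosterone": "none",   "cortisol": "strong", "DOC": "strong"},
}

def effect_of_L111Q(s106p_present):
    """Phenotypic change caused by adding L111Q on a given S106P background."""
    before = ACTIVATION[(s106p_present, False)]
    after = ACTIVATION[(s106p_present, True)]
    return {hormone: (before[hormone], after[hormone]) for hormone in before}

print(effect_of_L111Q(False))  # on the ancestor: every response is destroyed
print(effect_of_L111Q(True))   # after S106P: cortisol restored, aldosterone lost
```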

Can we determine the order in which these mutations appeared, and can we understand how such epistatic effects arise? Structural changes can very easily lead to the type of epistatic interactions between mutations now documented in hormone-receptor evolution, because one structural change can condition the effect of another. Thus, single mutations whose structural consequences depend on one another can conspire to give the impression of irreducible complexity. Although the mutation L111Q creates a possibly lethal phenotype when it occurs alone in the ancestral corticoid receptor, it confers the glucocorticoid receptor phenotype if it is preceded by the S106P mutation, which itself is nonlethal. Such interacting pairs of mutations are common and important in evolution.

Bridgham et al. conclude that the insensitivity of the glucocorticoid receptor to aldosterone most likely evolved by the S106P mutation followed by the L111Q mutation, because the intermediate phenotype is still viable. Although this is the most parsimonious conclusion, the other order of mutation events cannot be ruled out. Indeed, the experiments of Lenski et al. (4), which followed the lines of descent of digital organisms, found, surprisingly, that occasional highly deleterious mutations were rescued by a partner mutation that conferred a beneficial trait. Thus, the highly deleterious partner of the pair can indeed come first, as long as the second mutation does not occur too late. In any case, the evidence is clear that such “multiresidue features” (6) can and do evolve. Understanding how they evolve requires taking into account complex epistatic interactions that allow nonlethal intermediate states that might not appear obvious at first glance.
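Under the same kind of toy assumptions, the parsimony reasoning can be made concrete by enumerating both mutational orderings and flagging any intermediate that responds to none of the three hormones. This is only an illustrative sketch, and, as the digital-organism experiments show, a nonfunctional intermediate does not strictly rule a path out if a rescuing mutation follows quickly enough.

```python
from itertools import permutations

# Toy map from genotype (has_S106P, has_L111Q) to whether the receptor still
# responds to at least one of the three hormones (qualitative, per the text).
FUNCTIONAL = {
    (False, False): True,   # ancestral receptor: responds to all three
    (True,  False): True,   # S106P alone: the DOC response remains intact
    (False, True):  False,  # L111Q alone: no response to any hormone
    (True,  True):  True,   # both: the glucocorticoid receptor phenotype
}

def viable(order):
    """True if every intermediate along this mutational order stays functional."""
    s106p = l111q = False
    for mutation in order:
        if mutation == "S106P":
            s106p = True
        else:
            l111q = True
        if not FUNCTIONAL[(s106p, l111q)]:
            return False
    return True

for order in permutations(["S106P", "L111Q"]):
    label = "viable" if viable(order) else "nonfunctional intermediate"
    print(" -> ".join(order), ":", label)
# S106P first keeps a working receptor throughout; L111Q first does not.
```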

The Bridgham et al. and Lenski et al. (4) studies are of particular scientific interest in light of the political attention that intelligent design has received lately. Although these authors have not directly addressed this controversy in the discussion of their work—because the work itself is intrinsically interesting to biologists—such studies solidly refute all parts of the intelligent design argument. Those “alternate” ideas, unlike the hypotheses investigated in these papers, remain thoroughly untested. Consequently, whatever debate remains must be characterized as purely political.

References

1. J. T. Bridgham, S. M. Carroll, J. W. Thornton, Science 312, 97 (2006).
2. C. Darwin, On the Origin of Species (John Murray, London, 1859).
3. M. J. Behe, Darwin's Black Box: The Biochemical Challenge to Evolution (Free Press, New York, 1996).
4. R. E. Lenski, C. Ofria, R. T. Pennock, C. Adami, Nature 423, 139 (2003).
5. A. Aharoni et al., Nat. Genet. 37, 73 (2005).
6. M. J. Behe, D. W. Snoke, Protein Sci. 13, 2651 (2004).
