Perspective | Systems Biology

How Information Theory Handles Cell Signaling and Uncertainty


Science, 19 Oct 2012:
Vol. 338, Issue 6105, pp. 334-335
DOI: 10.1126/science.1227946

Intracellular biochemical networks have traditionally been studied by stimulating populations of genetically identical cells and measuring the aggregate response. However, such population-based measurements may obscure the idiosyncrasies of individual cells and therefore suggest deceptively precise input-output relationships. Consequently, signaling pathways have been viewed as the finely tuned circuitry that programs the cell to behave in a predefined manner (1). Detailed studies of cellular biochemistry at the single-cell level now show that cells responding en masse may have quite varied behaviors when examined individually (2), raising the question of how precisely signaling pathways can control a cell's actions (3).

It is likely that under most circumstances, cells in populations constantly diversify their states (e.g., the response of their signaling networks) (4). Thus, there is an uncertainty in signaling outcomes, and responses of cells randomly selected from a population are unpredictable. In this sense, the match between the environmental input and the cellular output can no longer be predefined and stereotypically precise (5). This hampers the ability of individual cells or small cell ensembles to make appropriate decisions in fluctuating environments, and requires a fundamentally different view toward analyzing signaling systems. Hence, rather than relying on seemingly robust and sensitive signaling input-output dependencies to analyze networks and cell behavior, we should instead seek to learn the limits to how well cell signaling can enable decision-making, given a cell's uncertain response to changes in the environment.

Variability in cell response is frequently referred to as “noise,” and current metrics to characterize noise report on its magnitude but do not quantify how the noise limits the cell's decision-making abilities (6). Indeed, performance of a signaling network depends on more than just the level of noise in its underlying chemistry. For instance, signaling may allow a population of cells to simultaneously sample several distinct classes of behavior—a type of cellular bet hedging—which can improve some aspects of decision-making but with a cost of increased variability (7). Therefore, a new “language” may be needed to understand and quantify the impact of noise (variability) on a cell's functionality.

Mathematics turns out to have just the right theory. This theory has already been adopted to understand the workings of another type of “noisy” signaling network, the nervous system (8). Created to analyze uncertainty in human communication, information theory enables the limits of decision-making fidelity to be rigorously defined and measured (9). Conveniently, its general formulation permits analysis of many complex systems, including those found in biological signaling (10). Within this theory and in the context of signaling, information is quantified as the uncertainty about the environment that is removed by signaling activity (which is equivalent to the knowledge gained by the signaling system). The amount of information depends on both the amount of variability in the environment (the initial level of uncertainty) and noise in the signaling process itself (affecting the amount of uncertainty remaining). Extending this definition, we can also determine the information capacity of a system, which is the maximum information that a signaling system can obtain about some aspect of the environment under ideal conditions. This capacity is an intrinsic property of the signaling system, as much as the underlying chemistry, in that it is the key determinant of achievable decision-making fidelity (11).
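In standard information-theoretic notation (a textbook formulation added here for concreteness; the article itself gives only the verbal definitions), these quantities can be written as follows, with S denoting the state of the environment and R the signaling response:

```latex
% Information = uncertainty about the signal S removed by observing the response R
I(S;R) = H(S) - H(S \mid R)
       = \sum_{s,r} P(s,r)\,\log_2 \frac{P(s,r)}{P(s)\,P(r)} \quad \text{(bits)}

% Information capacity = the maximum over all input distributions P(s)
C = \max_{P(s)} I(S;R)
```

Here H(S) is the initial uncertainty set by the variability of the environment, and H(S|R) is the uncertainty that remains after the noisy signaling step, mirroring the two dependencies noted above.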

As an example, consider a signaling pathway whose output measures the concentration of an extracellular ligand (i.e., a dose response). Signaling noise prevents a cell from determining the precise ligand concentration. However, does the noise also prevent a cell from resolving different concentrations of the ligand, and if so, how many and how accurately? Information theory states that it is possible to use the noisy signaling output to accurately discriminate different input doses (11). Furthermore, the number of resolvable concentrations is limited and is a simple function of the pathway capacity (12). Alternatively, if mistakes do occur, the capacity determines the minimum amount of error that a cell must tolerate, with higher capacity unambiguously allowing for lower error (13). Information theory allows such categorical statements without necessarily requiring detailed specifics of the signaling network organization and operation, and thus can be used to analyze the capabilities of complex and incompletely characterized biological systems.
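As a concrete illustration, the following is a minimal numerical sketch of this dose-discrimination argument. The Gaussian dose-response channel, dose spacing, noise level, and bin count are all hypothetical, chosen only to make the calculation runnable; none of them come from the article or its references.

```python
# Minimal sketch: mutual information between ligand dose and a noisy pathway
# output. The Gaussian dose-response channel, dose spacing, noise level, and
# bin count are hypothetical, chosen only to make the calculation concrete.
import numpy as np

def mutual_information(p_signal, p_resp_given_sig):
    """I(S;R) in bits for discrete P(s) and P(r|s) (rows = signals)."""
    joint = p_signal[:, None] * p_resp_given_sig        # P(s, r)
    p_resp = joint.sum(axis=0)                          # P(r)
    indep = p_signal[:, None] * p_resp[None, :]         # P(s) P(r)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / indep[nz])))

rng = np.random.default_rng(0)
doses = np.arange(8)                                    # 8 ligand doses
bins = np.linspace(-3, 10, 33)                          # 32 response bins
# Empirical P(response | dose): output = dose + Gaussian noise (sigma = 0.8)
p_r_given_s = np.array(
    [np.histogram(d + rng.normal(0.0, 0.8, 50_000), bins=bins)[0]
     for d in doses], dtype=float)
p_r_given_s /= p_r_given_s.sum(axis=1, keepdims=True)

I = mutual_information(np.full(8, 1 / 8), p_r_given_s)
print(f"I(dose; response) = {I:.2f} bits "
      f"-> about {2**I:.1f} reliably resolvable dose levels")
```

On the standard information-theoretic reading, a capacity of C bits corresponds to roughly 2^C reliably distinguishable input levels, which is the "simple function of the pathway capacity" referred to above.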

Currently, we do not understand the decision-making limits of the vast majority of signaling systems, even those affected by variability. Consequently, the factors that affect and regulate those limits are also generally unknown. Thus, from the standpoint of information transfer, it is essential to determine the capacities of these signaling pathways and networks and the relationships between system structure and capacity. For instance, information lost at each step of processing should prevent information sources and destinations from being separated by more than a few intermediates (14, 15). Simultaneously, it is often necessary to integrate multiple pieces of information within a cell. Both of these considerations drive the formation of so-called small world networks that are widespread in biological systems and other networks, in which a relatively short path connects any two signaling nodes (16). Such networks are configured so that multiple signals pass through central nodes, thereby raising the information theoretic question of how the signals are multiplexed through the hub so as to minimally interfere with one another (17).
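The constraint on the number of intermediates follows from the data processing inequality, a standard result stated here for completeness (the article invokes it only implicitly):

```latex
% For a signaling cascade forming a Markov chain
%   S (input) -> X (intermediate) -> R (output),
% no downstream step can create new information about S:
I(S;R) \le \min\{\, I(S;X),\; I(X;R) \,\}
```

Because each noisy step can only lose information about the input, long cascades compound these losses, which favors the short paths characteristic of small-world architectures.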

Acquiring information typically costs the cell energy, time, or opportunity, so a signaling system that collects more information than is necessary or ignores information that is easily obtained wastes valuable resources. Therefore, under evolutionary pressure, signaling systems are expected to be optimally matched to the sources of information they have evolved to process. Indeed, examples from neuroscience (such as sensory perception) and developmental biology (such as embryonic patterning) show that biological systems usually have a capacity that is minimally sufficient for the information they process (18, 19).

This optimality principle can answer long-standing questions that cannot currently be addressed through models or direct experiments. One example is determining which of several putative aspects of environmental input (e.g., ligand dose, rate of dose change, or duration of ligand presentation) are biologically relevant. Information-processing optimality suggests that the aspect of the input associated with a higher capacity is the more pertinent one. This concept further implies that the conditions that maximally utilize the information capacity of a sensory system should reflect the natural fluctuations in the environment. These conditions can be computed from controlled laboratory observations, enabling a form of “inverse ecology” that is sometimes the only feasible way to gain insight into a cell's natural surroundings (20) (see the figure). Similar arguments can be used to infer which aspect of a cell's response to an environmental input is most relevant to that input (21).

Figure: Inverse ecology.

To determine the frequency with which a stimulus (such as a ligand) occurs in a cell's natural environment, a population of genetically identical cells can be exposed to the ligand in an experimental setting. Individual cell responses can be measured at a range of doses, with variation in response at each dose (noise); these measurements can then be used to infer properties of the cell's natural environment. Mathematically, such experimental measurements provide a conditional probability distribution, P(response|signal). It is assumed that a cell is "optimized" to obtain the most information about the frequency of doses it encounters in its natural environment, P(signal). Information theory (IT) determines the P(signal) that maximizes the information capacity of the signaling network responsive to the particular ligand; this distribution reflects the dose frequencies in the cell's natural environment (a computational sketch of this maximization appears below).

CREDIT: C. BICKEL/SCIENCE
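The maximization described in the caption can be made concrete with a short computational sketch. It assumes the standard Blahut-Arimoto algorithm (a common way to compute channel capacity; the article does not specify a method), and the small channel matrix is invented for illustration.

```python
# Sketch of the "inverse ecology" computation: given a measured conditional
# distribution P(response | signal), find the input distribution P(signal)
# that achieves the channel capacity. Uses the standard Blahut-Arimoto
# algorithm; the example channel matrix below is hypothetical.
import numpy as np

def blahut_arimoto(p_resp_given_sig, tol=1e-9, max_iter=10_000):
    """Return (capacity in bits, capacity-achieving P(signal))."""
    Q = np.asarray(p_resp_given_sig, dtype=float)  # rows: signals, cols: responses
    p = np.full(Q.shape[0], 1.0 / Q.shape[0])      # start from a uniform prior
    for _ in range(max_iter):
        r = p @ Q                                  # marginal P(response)
        with np.errstate(divide="ignore", invalid="ignore"):
            log_ratio = np.where(Q > 0, np.log2(Q / r), 0.0)
        D = (Q * log_ratio).sum(axis=1)            # D(Q_s || r) per signal, bits
        p_new = p * np.exp2(D)                     # multiplicative update
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    # Mutual information at the final p; equals the capacity at convergence.
    r = p @ Q
    with np.errstate(divide="ignore", invalid="ignore"):
        log_ratio = np.where(Q > 0, np.log2(Q / r), 0.0)
    return float(np.sum(p[:, None] * Q * log_ratio)), p

# Hypothetical 3-dose, 4-response-bin channel measured in the lab:
Q = np.array([[0.7, 0.2, 0.1, 0.0],
              [0.1, 0.4, 0.4, 0.1],
              [0.0, 0.1, 0.2, 0.7]])
C, p_signal = blahut_arimoto(Q)
print(f"capacity = {C:.3f} bits; inferred P(signal) = {p_signal.round(3)}")
```

The returned P(signal) is the dose-frequency distribution under which this pathway would transmit the most information, which is the article's proposed proxy for the stimulus statistics of the cell's natural environment.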

As biologists are finding more uses for information theory, it is evolving to handle more complex biological networks (22–25). By rigorously quantifying the properties and limits of cellular information transfer, new questions will be formulated and answered within the single-cell paradigm shift that is already underway. As the ability to quantify the functionality of signaling networks improves, we will hopefully gain insight into the details of their underlying chemistry and a deeper understanding of their higher-level organization and functionality.

References and Notes

Acknowledgments: We thank I. Nemenman for discussions and advice. This work is supported by NIH grants GM072024, GM084322, and CA65145.