Perspectives

Hope for America's next generation


Science  06 May 2016:
Vol. 352, Issue 6286, pp. 661-662
DOI: 10.1126/science.aaf7270

Improving child health.

Better access to medical care may be contributing to sizable decreases in mortality rates among children in disadvantaged counties in the United States. The photo shows 5-year-old Oscar de la Cruz getting an eye exam at the nonprofit Mary's Center in Washington, D.C., on 24 February 2014.

PHOTO: LINDA DAVIDSON/THE WASHINGTON POST VIA GETTY IMAGES

A deluge of recent studies has shown that poorer communities suffer worse health outcomes. Among low-income Americans, life expectancy at age 40 in the poorest areas of the U.S. is 4.5 years lower than in the highest-income areas (1). In 2010, infant mortality rates in the poorest U.S. communities were over 70% higher than those in the most affluent ones [see tables S3 and S4 in (2)]. On page 708 of this issue, Currie and Schwandt paint a more complicated but encouraging picture (2). They show that, despite rising inequality in almost every dimension of American life, the child mortality gap between the poorest and the richest counties has shrunk in recent decades.

The authors find that between 1990 and 2010, life expectancy at birth rose for both women and men in communities across the United States. Mortality rates in all 5-year age groups of American children, from infants to late adolescents, fell dramatically. Perhaps more surprisingly, the poorest counties saw the largest absolute decreases in child mortality rates, almost twice as large as those in the richest counties. These important findings challenge the conventional wisdom that health has been uniformly worsening in more disadvantaged communities. They also challenge the idea that further increases in longevity in wealthy countries depend only on improvements among the middle-aged or elderly (3).

In their study, Currie and Schwandt calculate mortality rates of 5-year age groups and compare counties with different poverty rates. This method picks up nuances in U.S. health trends that are not captured by national life expectancy, a standard but relatively crude measure of health. It also allows them to zoom in on changes at different points in the age distribution and in counties across the country. Of course, the results do not imply that poverty causes shorter lives. They should be seen as demonstrating a relationship between income and health, rather than as a statement that one causes the other.
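To make the approach concrete, the sketch below shows roughly how such rates could be computed from county-level vital statistics. It is not the authors' code; the table layout, column names, and the simple quantile binning of counties are illustrative assumptions (the published study groups counties in a population-weighted way, which this sketch omits).

```python
# Minimal sketch (not the authors' code) of computing 5-year age-group mortality
# rates with counties grouped by poverty rate. Assumes a hypothetical DataFrame
# with columns: county, year, poverty_rate, age_group, deaths, population.
import pandas as pd

def mortality_by_poverty_group(df: pd.DataFrame, n_groups: int = 20) -> pd.DataFrame:
    """Return deaths per 1,000 population by year, poverty bin, and age group."""
    out = df.copy()
    # Bin counties by poverty rate (simple quantiles here; the study's grouping
    # is population-weighted, which this sketch does not reproduce).
    out["poverty_bin"] = pd.qcut(out["poverty_rate"], q=n_groups,
                                 labels=False, duplicates="drop")
    grouped = (out.groupby(["year", "poverty_bin", "age_group"], as_index=False)
                  [["deaths", "population"]].sum())
    grouped["mortality_per_1000"] = 1000 * grouped["deaths"] / grouped["population"]
    return grouped

# Example use: compare the poorest and richest bins for infants in 1990 vs. 2010.
# rates = mortality_by_poverty_group(county_data)
# infants = rates[rates["age_group"] == "0-4"]
```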

The good news is the persistent gains in life expectancy that young Americans have experienced in the last 20 years, even as economic and health inequality among their parents and grandparents has grown. Economists do not always agree on how economic conditions affect health (4). Still, readers may not find it surprising that child mortality rates fell sharply during the economic boom of the 1990s. It may be more surprising that child mortality rates continued to fall over the next decade, even though the U.S. economy shed 8 million jobs and the number of Americans without health insurance rose by 5.7 million between 2007 and 2010 (5, 6).

The Generational Divide

Between 1990 and 2010, child mortality rates fell fastest in the poorest areas of the United States, narrowing the gap between high-poverty and low-poverty counties. This trend (left) is an encouraging counterpoint to the growing mortality-rate gap among older Americans (right). Data from table S3 in (2).

GRAPHIC: C. SMITH/SCIENCE

The scale of these improvements in the most disadvantaged places in the country is impressive. Consider Genesee County, Michigan, home to the city of Flint and one of the poorest U.S. counties. Between 1990 and 2010, the county's poverty rate rose from 16 to 20%. At the same time, the 3-year moving average of its infant mortality rate fell by five deaths per 1000 births, to 7.8. In nearby Oakland County, home to more affluent Detroit suburbs and with a poverty rate of ~10%, infant mortality fell by a smaller but still impressive two deaths per 1000 births, to 5.8. As a result, the gap between the two counties more than halved, from five to two deaths per 1000 births. As Currie and Schwandt show, this pattern held in almost all counties in the United States and for all groups of children under 20, resulting in much reduced disparities in child mortality (see the figure).
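As a back-of-the-envelope check, the quoted figures can be combined as follows; the 1990 levels are simply the 2010 levels plus the stated declines.

```python
# Arithmetic check of the Genesee vs. Oakland example, using the figures quoted
# in the text (deaths per 1,000 births, 3-year moving averages).
genesee_2010, genesee_decline = 7.8, 5.0   # fell by ~5, to 7.8
oakland_2010, oakland_decline = 5.8, 2.0   # fell by ~2, to 5.8

genesee_1990 = genesee_2010 + genesee_decline   # ~12.8
oakland_1990 = oakland_2010 + oakland_decline   # ~7.8

gap_1990 = genesee_1990 - oakland_1990          # ~5 deaths per 1,000 births
gap_2010 = genesee_2010 - oakland_2010          # ~2 deaths per 1,000 births
print(gap_1990, gap_2010)                       # the gap more than halved
```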

Not all their results are as encouraging. Corroborating recent studies (2, 7), the authors show that mortality trends are less favorable for Americans over age 40. Improvements in older-adult health have been much larger in more affluent areas, reflecting a growing mortality gap among older adults.

Currie and Schwandt are largely silent on the causes of these changes. Although it is unlikely that they can be attributed to a single factor, the different trends in children and adults help narrow the search. Oft-cited explanations for growing adult health disparities include the scarcity of good jobs, rising income inequality, deteriorating infrastructure, and worsening health behaviors (including lack of exercise, smoking, and opioid abuse). Some argue that this toxic cocktail is killing working-age Americans (2, 7). This makes the improvement in mortality among young Americans all the more impressive, because child health is adversely affected by many of the same factors (8, 9).

More promising explanations relate to the substantial expansions in public health programs for children over the past 25 years. The expansion of Medicaid—a program shoring up health insurance in the most disadvantaged areas—is associated with significant declines in infant mortality (10). Expansions of the State Children's Health Insurance Program may also have disproportionately improved health in poorer areas (11). Similarly, expansion of programs like Community Health Centers would have benefited poor communities most (12). These improvements in health care access amplified the impact of advances in medical care and health information over the same period (such as the treatment of congenital defects and prevention of sudden infant death syndrome).

Whatever the cause of the decline in health inequality among children, improved child survival has implications for the interpretation of mortality trends at older ages. The logic is simple: If the children whose lives are saved tend to be in more fragile health, their survival could result in higher death rates at older ages. For instance, the surviving children may be more likely to die in their 40s. This “selection” process—to use the language of economists—would predict modest increases in mortality rates at older ages, even though some children live longer. Rising mortality rates for 45- to 55-year-olds may thus be at least partially attributable to gains in longevity among children 40 to 50 years ago (13), as well as among younger adults more recently.
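A stylized simulation can make this selection argument concrete. The numbers below (a 5% frail share, a threefold midlife risk for frail survivors, and two hypothetical childhood survival rates) are purely illustrative assumptions, not estimates from the study; the point is only that saving frailer children can nudge up midlife mortality even though every child is at least as likely to survive.

```python
# Illustrative simulation of the "selection" argument (all parameters hypothetical).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
frail = rng.random(n) < 0.05                    # assume 5% of children are frail

def midlife_mortality(frail_child_survival: float) -> float:
    """Deaths per 1,000 at midlife among those who survive childhood, assuming
    frail survivors face a higher (illustrative) midlife risk."""
    survives_childhood = np.where(frail, rng.random(n) < frail_child_survival, True)
    midlife_risk = np.where(frail, 0.03, 0.01)  # frail survivors: 3x the baseline risk
    dies_midlife = rng.random(n) < midlife_risk
    return 1000 * dies_midlife[survives_childhood].mean()

print(midlife_mortality(0.50))  # earlier cohort: half of frail children survive childhood
print(midlife_mortality(0.95))  # later cohort: most survive -> slightly higher midlife rate
```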

The even better news is that these patterns of health inequality seem to be improving over time. Infants born in the most disadvantaged counties in 2000 were much more likely to survive than those born in 1990. Ten years later, in 2010, mortality rates for those same cohorts, by then older children, were still much lower than those of the previous generation at the same ages. If these declines in mortality persist, they will represent a powerful new trend with important policy implications. So far, at least, the new generation of Americans is defying the pattern of growing inequality that has defined their parents' and grandparents' generations.

References and Notes

  1. Bureau of Labor Statistics, U.S. Department of Labor, Employment, Hours, and Earnings from the Current Employment Statistics Survey, see http://data.bls.gov/pdq/SurveyOutputServlet?request_action=wh&graph_name=CE_cesbref1.
Acknowledgments: We gratefully acknowledge the use of the services and facilities of the Population Studies Center (PSC) at the University of Michigan (R24 HD041028). During work on this project, B.T. was supported by the NIA as a PSC Trainee (T32 AG000221) and by the NICHD (R01 HD070950-02).
