Report

Exposure to ideologically diverse news and opinion on Facebook

Science  05 Jun 2015:
Vol. 348, Issue 6239, pp. 1130-1132
DOI: 10.1126/science.aaa1160

Not getting all sides of the news?

People are increasingly turning away from mass media to social media as a way of learning news and civic information. Bakshy et al. examined the news that millions of Facebook users' peers shared, what information these users were presented with, and what they ultimately consumed (see the Perspective by Lazer). Friends shared substantially less cross-cutting news from sources aligned with an opposing ideology. People encountered roughly 15% less cross-cutting content in news feeds due to algorithmic ranking and clicked through to 70% less of this cross-cutting content. Within the domain of political news encountered in social media, selective exposure appears to drive attention.

Science, this issue p. 1130; see also p. 1090

Abstract

Exposure to news, opinion, and civic information increasingly occurs through social media. How do these online networks influence exposure to perspectives that cut across ideological lines? Using deidentified data, we examined how 10.1 million U.S. Facebook users interact with socially shared news. We directly measured ideological homophily in friend networks and examined the extent to which heterogeneous friends could potentially expose individuals to cross-cutting content. We then quantified the extent to which individuals encounter comparatively more or less diverse content while interacting via Facebook’s algorithmically ranked News Feed and further studied users’ choices to click through to ideologically discordant content. Compared with algorithmic ranking, individuals’ choices played a stronger role in limiting exposure to cross-cutting content.

Exposure to news and civic information is increasingly mediated through online social networks and personalization (1). Information abundance provides individuals with an unprecedented number of options, shifting the function of curating content from newsroom editorial boards to individuals, their social networks, and manual or algorithmic information sorting (2–4). Although these technologies have the potential to expose individuals to more diverse viewpoints (4, 5), they also have the potential to limit exposure to attitude-challenging information (2, 3, 6), which is associated with the adoption of more extreme attitudes over time (7) and misperception of facts about current events (8). This changing environment has led to speculation about the creation of “echo chambers” (in which individuals are exposed only to information from like-minded individuals) and “filter bubbles” (in which content is selected by algorithms according to a viewer’s previous behaviors), which are devoid of attitude-challenging content (3, 9). Empirical attempts to examine these questions have been limited by difficulties in measuring news stories’ ideological leanings (10) and in measuring exposure—relying on either error-laden, retrospective self-reports or behavioral data with limited generalizability—and have yielded mixed results (4, 9, 11–15).

We used a large, comprehensive data set from Facebook that allows us to (i) compare the ideological diversity of the broad set of news and opinion shared on Facebook with that shared by individuals’ friend networks, (ii) compare this with the subset of stories that appear in individuals’ algorithmically ranked News Feeds, and (iii) observe what information individuals choose to consume, given exposure on News Feed. We constructed a deidentified data set that includes 10.1 million active U.S. users who self-report their ideological affiliation and 7 million distinct Web links (URLs) shared by U.S. users over a 6-month period between 7 July 2014 and 7 January 2015. We classified stories as either “hard” (such as national news, politics, or world affairs) or “soft” content (such as sports, entertainment, or travel) by training a support vector machine on unigram, bigram, and trigram text features (details are available in the supplementary materials, section S1.4.1). Approximately 13% of these URLs were classified as hard content. We further limited the set of hard news URLs to the 226,000 distinct hard-content URLs shared by at least 20 users who volunteered their ideological affiliation in their profile, so that we could accurately measure ideological alignment. This data set included ~3.8 billion potential exposures (cases in which an individual’s friend shared hard content, regardless of whether it appeared in her News Feed), 903 million exposures (cases in which a link to the content appeared on screen in an individual’s News Feed), and 59 million clicks among users in our study.
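
As an illustration of the classification step described above, the following is a minimal sketch of a hard/soft news classifier: a linear support vector machine over unigram, bigram, and trigram text features, here built with scikit-learn. The training texts, labels, and vectorizer settings are illustrative assumptions; the authors' actual classifier code is archived at the Dataverse link in the acknowledgments.

```python
# Minimal sketch of the hard/soft content classifier described in the text:
# a linear SVM over unigram, bigram, and trigram features. The training
# examples and all parameter choices are illustrative assumptions; the
# authors' released code is archived in the Harvard Dataverse (see below).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Hypothetical labeled examples: 1 = "hard" content, 0 = "soft" content.
texts = [
    "Senate passes budget resolution after partisan debate",
    "Ten travel destinations for your next summer vacation",
    "World leaders convene to negotiate climate accord",
    "Celebrity chef shares favorite weeknight recipes",
]
labels = [1, 0, 1, 0]

classifier = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 3)),  # unigrams, bigrams, trigrams
    LinearSVC(),
)
classifier.fit(texts, labels)

# Classify an unseen headline.
print(classifier.predict(["Congress debates new trade policy"]))
```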

We then obtained a measure of content alignment (A) for each hard story by averaging the ideological affiliation of each user who shared the article. Alignment is not a measure of media slant; rather, it captures differences in the kind of content shared among a set of partisans, which can include topic matter, framing, and slant. These scores, averaged over websites, capture key differences in well-known ideologically aligned media sources: FoxNews.com is aligned with conservatives (As = +0.80), whereas HuffingtonPost.com is aligned with liberals (As = –0.65) (additional detail and validation are provided in the supplementary materials, section S1.4.2). We observed substantial polarization among hard content shared by users, with the most frequently shared links clearly aligned with largely liberal or conservative populations (Fig. 1).

Fig. 1 Distribution of ideological alignment of content shared on Facebook, measured as the average affiliation of sharers weighted by the total number of shares.

Content was delineated as liberal, conservative, or neutral on the basis of the distribution of alignment scores (details are available in the supplementary materials).
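
As a concrete sketch of the alignment measure, the snippet below averages the numerically coded affiliations of a single URL's sharers. The five-point coding from −2 (very liberal) to +2 (very conservative) and the sample data are assumptions made for illustration; the paper's actual coding is described in section S1.4.2 of the supplement.

```python
# Sketch of the content-alignment score A for a single URL: the mean
# self-reported affiliation of its sharers. The numeric coding (here a
# five-point scale, -2 = very liberal to +2 = very conservative) and the
# sharer data are illustrative assumptions; see supplement section S1.4.2.
from statistics import mean

sharer_affiliations = [2, 1, 2, 0, 1, 2, -1, 0]  # hypothetical sharers

alignment = mean(sharer_affiliations)
print(f"A = {alignment:+.2f}")  # positive: shared mostly by conservatives
```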

The flow of information on Facebook is structured by how individuals are connected in the network. Interpersonal networks on Facebook differ from the segregated structure of political blogs (16); although Facebook friendships cluster by political affiliation, many friendships also cut across ideological lines. Among friendships with individuals who report an ideological affiliation in their profile, the median proportion of friendships that liberals maintain with conservatives is 0.20 (interquartile range (IQR) [0.09, 0.36]); similarly, the median proportion of friendships that conservatives maintain with liberals is 0.18 (IQR [0.09, 0.30]) (Fig. 2).

Fig. 2 Homophily in self-reported ideological affiliation.

Proportion of links to friends of different ideological affiliations for liberal, moderate, and conservative users. Points indicate medians, thick lines indicate interquartile ranges, and thin lines represent 10th to 90th percentile ranges.
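
To make the homophily summary above concrete, here is a small sketch that computes, for each user, the proportion of affiliated friends who report the opposing ideology, then summarizes the distribution by its median and interquartile range. The friend lists are invented for illustration.

```python
# Illustration of the cross-cutting friendship statistics quoted above:
# per-user proportion of affiliated friends with the opposing ideology,
# summarized by the median and interquartile range. Data are invented.
import numpy as np

# Hypothetical liberal users' friends who report an affiliation:
# "L" = liberal, "C" = conservative.
friend_lists = [
    ["L", "L", "C", "L"],
    ["L", "C", "C", "L", "L"],
    ["L", "L", "L", "L", "C"],
    ["C", "L", "L", "C", "L", "L"],
]

proportions = [friends.count("C") / len(friends) for friends in friend_lists]
q1, q3 = np.percentile(proportions, [25, 75])
print(f"median = {np.median(proportions):.2f}, IQR = [{q1:.2f}, {q3:.2f}]")
```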

How much cross-cutting content individuals encounter depends on who their friends are and what information those friends share. If individuals acquired information from random others, ~45% of the hard content that liberals encountered would be cross-cutting, compared with 40% for conservatives (Fig. 3B). Of course, individuals do not encounter information at random, either in offline environments (14) or on the Internet (9). Despite the slightly higher volume of conservatively aligned articles shared (Fig. 1), liberals tend to be connected to fewer friends who share information from the other side than are their conservative counterparts: Of the hard news stories shared by liberals’ friends, 24% are cross-cutting, compared with 35% for conservatives (Fig. 3B).

Fig. 3 Cross-cutting content at each stage in the diffusion process.

(A) Illustration of how algorithmic ranking and individual choice affect the proportion of ideologically cross-cutting content that individuals encounter. Gray circles illustrate the content present at each stage in the media exposure process. Red circles indicate conservatives, and blue circles indicate liberals. (B) Average ideological diversity of content (i) shared by random others (random), (ii) shared by friends (potential from network), (iii) that actually appeared in users’ News Feeds (exposed), and (iv) that users clicked on (selected).

The media that individuals consume on Facebook depends not only on what their friends share but also on how the News Feed ranking algorithm sorts these articles and on what individuals choose to read (Fig. 3A). The order in which users see stories in the News Feed depends on many factors, including how often the viewer visits Facebook, how much she interacts with certain friends, and how often she has clicked on links to certain websites in News Feed in the past. We found that after ranking, there is on average slightly less cross-cutting content: risk ratios comparing the probability of seeing cross-cutting content with that of seeing ideologically consistent content indicate that conservatives see 5% less cross-cutting content and liberals see 8% less (supplementary materials, section S1.7).
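
One way to read the risk ratios above: compare the probability that a cross-cutting story shared by a user's friends appears in her News Feed with the corresponding probability for ideologically consistent stories. The sketch below computes such a ratio from hypothetical counts; the paper's actual estimator, which conditions on further factors, is described in supplement section S1.7.

```python
# Sketch of the exposure risk ratio: P(appears in News Feed | cross-cutting)
# divided by P(appears in News Feed | ideologically consistent). The counts
# are invented; the paper's estimation procedure is in supplement S1.7.
potential_cross, exposed_cross = 1_000_000, 200_000
potential_consistent, exposed_consistent = 3_000_000, 650_000

rr = (exposed_cross / potential_cross) / (
    exposed_consistent / potential_consistent
)
print(f"risk ratio = {rr:.2f}")  # e.g. 0.92, i.e., ~8% less cross-cutting
```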

Individual choice further limits exposure to ideologically cross-cutting content. After adjusting for the effect of position [the click rate on a link is negatively correlated with its position in the News Feed (fig. S5)], we estimated that conservatives click on cross-cutting content 17% less often, and liberals 6% less often, than on ideologically consistent content, a pattern consistent with prior research (4, 17). Despite these tendencies, there is substantial room for individuals to consume more media from the other side; on average, viewers clicked on 7% of the hard content available in their feeds.
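
The position adjustment can be illustrated with direct standardization, one plausible approach: estimate click rates within feed-position strata and average them over a common position distribution before forming the ratio. The strata, rates, and weights below are invented, and the paper's actual adjustment procedure may differ.

```python
# Sketch of a position-adjusted click risk ratio via direct standardization
# (one plausible adjustment; the paper's exact procedure may differ). Click
# rates are estimated within feed-position strata, then averaged over a
# common position distribution before being compared. Data are invented.
import numpy as np

weights = np.array([0.5, 0.3, 0.2])  # common exposure share per position stratum

# Hypothetical per-stratum click rates; clicks decay with feed position.
rate_cross = np.array([0.060, 0.030, 0.015])
rate_consistent = np.array([0.070, 0.034, 0.017])

adjusted_rr = (weights @ rate_cross) / (weights @ rate_consistent)
print(f"adjusted risk ratio = {adjusted_rr:.2f}")  # ~0.86 for these inputs
```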

Our analysis has limitations. Although the vast majority of U.S. social media users are on Facebook (18), our study is limited to active users who volunteer an ideological affiliation on this platform. Facebook’s users tend to be younger, more educated, and more often female than the U.S. population as a whole (18). Other forms of social media, such as blogs or Twitter, exhibit different patterns of homophily among politically interested users, largely because ties there tend to form around common topical interests and/or specific content (16, 19), whereas Facebook ties primarily reflect many different offline social contexts (school, family, social activities, and work), which have been found to be fertile ground for fostering cross-cutting social ties (20). In addition, our distinction between exposure and consumption is imperfect; individuals may read the summaries of articles that appear in the News Feed and thus be exposed to some of an article’s content without clicking through.

This work informs long-standing questions about how media exposure is shaped by our social networks. Although partisans tend to maintain relationships with like-minded contacts [which is consistent with (21)], on average more than 20% of an individual’s Facebook friends who report an ideological affiliation are from the opposing party, leaving substantial room for exposure to opposing viewpoints (22, 23). Furthermore, in contrast to concerns that people might “listen and speak only to the like-minded” while online (6), we found exposure to cross-cutting content (Fig. 3B) along a hypothesized route: traditional media shared in social media (4, 24). Perhaps unsurprisingly, we show that the composition of our friend networks is the most important factor limiting the mix of content encountered in social media. Sharing within these networks is not symmetric: liberals tend to be connected to fewer friends who share conservative content than conservatives are to friends who share liberal content.

Within the population under study here, individual choices (2, 13, 15, 17) more than algorithms (3, 9) limit exposure to attitude-challenging content in the context of Facebook. Despite the differences in what individuals consume across ideological lines, our work suggests that individuals are exposed to more cross-cutting discourse in social media than they would be under the digital reality envisioned by some (2, 6). Rather than people browsing only ideologically aligned news sources or opting out of hard news altogether, our work shows that social media expose individuals to at least some ideologically cross-cutting viewpoints (4). Of course, we do not pass judgment on the normative value of cross-cutting exposure. Although normative scholars often argue that exposure to a diverse “marketplace of ideas” is key to a healthy democracy (25), a number of studies have found that exposure to cross-cutting viewpoints is associated with lower levels of political participation (22, 26, 27). Regardless, our work suggests that the power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals.

Supplementary Materials

www.sciencemag.org/content/348/6239/1130/suppl/DC1

Materials and Methods

Supplementary Text

Figs. S1 to S10

Tables S1 to S6

References (28–35)

References and Notes

  1. Acknowledgments: We thank J. Bailenson, D. Eckles, A. Franco, K. Garrett, J. Grimmer, S. Iyengar, B. Karrer, C. Nass, A. Peysakhovich, S. Taylor, R. Weiss, S. Westwood, J. M. White, and anonymous reviewers for their valuable feedback. The following code and data are archived in the Harvard Dataverse Network, http://dx.doi.org/10.7910/DVN/LDJ7MS: “Replication Data for: Exposure to Ideologically Diverse News and Opinion on Facebook”; R analysis code and aggregate data for deriving the main results (tables S5 and S6); Python code and dictionaries for training and testing the hard-soft news classifier; aggregate summary statistics of the distribution of ideological homophily in networks; and aggregate summary statistics of the distribution of ideological alignment for hard content shared by the top 500 most shared websites. The authors of this work are employed and funded by Facebook. Facebook did not place any restrictions on the design and publication of this observational study, beyond the requirement that this work was to be done in compliance with Facebook’s Data Policy and research ethics review process (www.facebook.com/policy.php).