The Internet, Policy & Politics Conferences

Oxford Internet Institute, University of Oxford

Pascal Jurgens, Birgit Stark: The Power of Default: Measuring the Impact of Platforms on Selectivity

Pascal Jurgens, Dept. of Communication, U of Mainz, Germany

Birgit Stark

The Internet has vastly expanded the media environment, but that does not mean recipients consume more diverse content. Quite the opposite: selectivity has become a necessary defense against information overload. Yet even the most prudent, well-trained selection strategies fail in the face of today's cacophony of fast-paced news, user-generated content and social media messages. To cope with this challenge, users increasingly rely on the filtering and sorting functions that are a staple feature of platforms. Drawing on a conceptual model of platforms as intermediaries that bias news selection, we present empirical results that measure the degree to which users are influenced by such biases. Our contribution thus helps to quantify the actual effects of platforms on users' selectivity.

On the web, recipients' navigation is increasingly assisted by search engines, their social attention prioritized by Facebook's feed algorithm, and their news curated by aggregators. A growing body of literature (Hindman 2009; Bakshy et al. 2015; Boczkowski & Mitchelstein 2013) documents that the recommendations from such services are far from neutral. Rather, platforms have their own inherent biases that can distort the content that reaches users. These insights have begun to draw attention from regulators who seek to safeguard economic and media diversity norms. However, existing legal frameworks concentrate on access providers and content producers; what they lack is a dedicated approach applicable to the intermediary role of platforms. Making sense of the way in which platforms affect our daily lives thus requires a new conceptual framework that encapsulates their effects and integrates with existing theories.

We propose a three-component model of intermediary effects that explains their influence on users through three types of internal logic and offers a testable explanation of intermediaries' biases. According to this model, platforms function

(a) as traditional gatekeepers, filtering content and thus eliminating its visibility. Such behavior may occur, for example, when search engines delete content at the request of government actors or citizens.

(b) At the same time, platforms are recommender systems, prioritizing content through a sorting logic that concentrates attention on the top items. Because search engine users focus on the few top results, for example, they rarely, if ever, reach results on subsequent result pages (Pan et al. 2007). The ranking criteria are formalized in algorithmic models that draw on a set of features and maximize a mathematical metric of relevance (see the sketch after this list).

(c) Finally, platforms serve as personalized recommender systems, tailoring content to individual users' profiles and potentially fragmenting publics.
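
To make the sorting logic in (b) concrete, the sketch below scores items with a simple weighted combination of features and ranks them in descending order of that score. The feature names and weights are purely hypothetical placeholders; actual platforms rely on far more elaborate, proprietary relevance models.

```python
# Minimal illustration of a sorting logic as described in (b): items are
# scored by a weighted combination of features and presented in descending
# order of that score. Features and weights here are hypothetical.

from dataclasses import dataclass

@dataclass
class Item:
    title: str
    recency: float      # e.g. inverse age, normalized to [0, 1]
    popularity: float   # e.g. normalized click or vote count
    text_match: float   # e.g. query-document similarity

# Hypothetical weights of the relevance model.
WEIGHTS = {"recency": 0.3, "popularity": 0.3, "text_match": 0.4}

def relevance(item: Item) -> float:
    """Linear relevance score: higher means ranked closer to the top."""
    return (WEIGHTS["recency"] * item.recency
            + WEIGHTS["popularity"] * item.popularity
            + WEIGHTS["text_match"] * item.text_match)

def rank(items: list[Item]) -> list[Item]:
    """Sort items so that attention concentrates on the top entries."""
    return sorted(items, key=relevance, reverse=True)

if __name__ == "__main__":
    feed = [
        Item("Breaking story", recency=0.9, popularity=0.4, text_match=0.5),
        Item("Background piece", recency=0.2, popularity=0.8, text_match=0.7),
        Item("Viral post", recency=0.6, popularity=0.9, text_match=0.2),
    ]
    for item in rank(feed):
        print(f"{relevance(item):.2f}  {item.title}")
```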

Following this model, platforms have the power to influence selective exposure to news and other content. Applied to a large part of a society's information consumption, this may lead to a range of undesirable scenarios, such as polarization and fragmentation, an accelerated rise of soft news over hard news, filter bubbles and mainstreaming.

There is some evidence in the literature that supports the hypothesis of such effects. For example, search engines influence users' clicking behavior through the position of links on result pages, even when controlling for link quality (Pan et al. 2007). Within the social network Facebook, the algorithm that generates the news feed increases selective avoidance of political content (Bakshy et al. 2015). So far, no such investigations exist for news aggregators. This is all the more surprising because these platforms are prototypical of many websites, so findings from them generalize more readily than those from idiosyncratic platforms such as social networks. Among existing news aggregators, Google News has established itself as a successful, algorithmically operated recommendation site that focuses on media sources. The much more popular variants, however, are social aggregators such as Digg and Reddit, where users may submit links or their own content and simultaneously vote on its popularity.

To test the effects that news aggregator recommendations have on their users, we draw on a full data set of Reddit comments that was made publicly available by a user. It spans the entire history of the site and comprises around 1.7 billion comments across eight years. We exploit the fact that, several times throughout the site's history, the set of default topics (subreddits) that are displayed prominently and make up the front-page selection was changed. Each time, the site staff decided which topics to add to or remove from the defaults. Using the number of comments as a proxy for interaction (sketched after the questions below), we analyze three effects:

(a) How does the inclusion of a topic in the defaults affect the attention it receives?

(b) How does the removal of a topic from the defaults affect the attention it receives?

(c) On the individual level, how does a change in the default topics affect the diversity of users' content consumption?
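
The interaction proxy mentioned above could be computed roughly as in the following sketch. It assumes the comment dump is stored as newline-delimited JSON records containing "subreddit" and "created_utc" fields; the file path is a placeholder.

```python
# Sketch: aggregate the comment dump into monthly comment counts per
# topic (subreddit), the proxy for interaction used in the analysis.
# Assumes newline-delimited JSON records with "subreddit" and
# "created_utc" fields; the file path is a placeholder.

import json
from collections import Counter
from datetime import datetime, timezone

def monthly_counts(path: str) -> Counter:
    """Return a Counter keyed by (subreddit, 'YYYY-MM') tuples."""
    counts: Counter = Counter()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            comment = json.loads(line)
            month = datetime.fromtimestamp(
                int(comment["created_utc"]), tz=timezone.utc
            ).strftime("%Y-%m")
            counts[(comment["subreddit"], month)] += 1
    return counts

if __name__ == "__main__":
    counts = monthly_counts("reddit_comments.jsonl")  # placeholder path
    for (subreddit, month), n in counts.most_common(10):
        print(subreddit, month, n)
```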

Using Bayesian counterfactual modeling, we treat the unchanged topics as a reference and predict how the topics under investigation would have fared had nothing changed. We then compare this counterfactual null model against the actual development in order to quantify the impact of the addition or removal.
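
As a rough illustration of this approach (not the study's exact model specification), the sketch below fits a Bayesian linear model on the pre-change period, using the unchanged topics' counts as predictors for an affected topic, and then projects a counterfactual for the post-change period. All data and dimensions are simulated placeholders.

```python
# Sketch of the counterfactual comparison: predict how an affected topic
# would have developed based on the unchanged (control) topics alone,
# then compare that prediction with the observed series. This is an
# illustrative stand-in, not the study's exact model specification.

import numpy as np
from sklearn.linear_model import BayesianRidge

def counterfactual_effect(controls: np.ndarray,
                          affected: np.ndarray,
                          change_at: int) -> float:
    """Relative difference between observed and counterfactual counts
    after the change.

    controls:  (T, k) per-period counts for unchanged topics
    affected:  (T,)   per-period counts for the changed topic
    change_at: index of the period in which the defaults changed
    """
    model = BayesianRidge()
    # Learn the pre-change relationship between control and affected topics.
    model.fit(controls[:change_at], affected[:change_at])
    # Project the counterfactual ("what if nothing had changed").
    # Posterior predictive uncertainty is available via return_std=True.
    predicted = model.predict(controls[change_at:])
    observed = affected[change_at:]
    return (observed.sum() - predicted.sum()) / predicted.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, k, change_at = 96, 5, 60                 # e.g. 96 months, 5 control topics
    controls = rng.poisson(1000, size=(T, k)).astype(float)
    affected = controls.mean(axis=1) + rng.normal(0, 20, T)
    affected[change_at:] *= 1.15                # simulated boost after becoming a default
    print(f"Estimated effect: {counterfactual_effect(controls, affected, change_at):+.1%}")
```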

First results from the analysis show a double-digit percentage change in comment activity on the affected topics. This means that simply by changing the default recommendations on a news aggregator, site operators are able to significantly alter the content that their users perceive. The evidence from this non-reactive, full-sample measurement captures the actual influence of the site's sorting logic and provides support for our model of platform effects.

References

Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130–1132.

Boczkowski, P. J., & Mitchelstein, E. (2013). The News Gap: When the Information Preferences of the Media and the Public Diverge. MIT Press.

Hindman, M. (2009). The Myth of Digital Democracy. Princeton University Press.

Pan, B., Hembrooke, H., Joachims, T., Lorigo, L., Gay, G., & Granka, L. (2007). In Google We Trust: Users' Decisions on Rank, Position, and Relevance. Journal of Computer-Mediated Communication, 12(3), 801–823.
