On the web, news media are profiling and targeting news users in order to serve them with news stories and advertisements that match their individual interests. For example, the New York Times uses algorithms to give readers story recommendations based on their reading history, or to hide stories they have already read. European news organisations, like the BBC in the UK and de Volkskrant in the Netherlands, as well as newer players like Facebook, are experimenting with personalisation as well.
News personalisation can allow media companies to better serve their users by, for example, helping them deal with information overload or by serving them more interesting content. Monetizing the resulting increase in attention from users can also have financial benefits for the industry. However, the increasing personalisation of news also raises concerns about the interests of media users, the role of the media in society, and the media’s relationship with news readers. For example, what are the users’ concerns with regard to personalisation, and under which conditions are they most likely to accept it? Who controls the algorithm which is in charge of what users see? What rights do users have in this process, and how does the introduction of personalisation interact with traditional media values? And how can the trust of the public be safeguarded in light of these changes?
The personalized news project combines legal and empirical research to answer such questions. The project will provide its answers in three stages. Firstly, the postdocs explore how personalisation affects the relationship between media users and the media. Informed by these findings, two PhD candidates investigate the legal rules on users’ rights of freedom of expression and privacy, as well as the media’s editorial responsibility in the context of personalised news. Finally, the principal investigator will develop a new normative framework through which news personalisation can be assessed, exploring the extent to which the government could stimulate new media strategies for personalised, public interest content.
The research team investigates personalisation from the perspective of both the media and the users and consists of one professor of information law, two postdoctoral researchers, and two legal PhD students. Specific information about them and their work can be found on their profile pages, which are linked below, or on the personalized communications team page.
Natali Helberger is Professor of Information Law, with a special focus on the use of information, at the Institute for Information Law (IViR). She leads the personalized news project and will develop a normative framework of personalised media based on the empirical research and political theories about the democratic role of the media.
Judith Möller is a postdoctoral researcher at the Amsterdam School of Communication Research (ASCoR). Her research in the personalized news project focuses on user attitudes towards implicit media personalisation as well as the conditions that affect users’ acceptance of media personalisation.
Mariella Bastian is a journalism scholar and former journalist. Building on her earlier research into media accountability and editorial responsibility, her work for the personalised communications project focuses on personalisation’s impact on newsroom structures, as well as the role journalistic values play during the design of personalisation algorithms.
Sarah Eskens is a PhD candidate at the Institute for Information Law (IViR). She researches the legal consequences of news personalisation from the perspective of the users, with a focus on the way in which rules of privacy and data protection law can protect their right to receive information.
Max van Drunen is a PhD candidate at the Institute for Information Law (IViR). His research explores how personalisation impacts editorial responsibility in the news industry, and how this concept can legally be furthered and protected in the context of personalisation.
Previous team members
Balázs Bodó is a socio-legal researcher at the Institute for Information Law (IViR). In his research for the personalized news project he explored the mechanisms behind news personalisation, the news media's attitudes, concerns, and economic and ethical considerations, and how these translate into journalistic practices, editorial guidelines and emerging algorithmic journalistic ethics.
Joyce Neys’s research for the personalised news project focused on the perspective of the user. She investigated the conditions that influence the social acceptability or rejection of tracking and targeting, in order to contribute to a further (re)conceptualisation of the democratic contribution of news media in contemporary society from a user perspective. By exploring the practices that different users adopt in the context of news personalisation, she shed light on the way in which these strategies inform our understanding of what it means to be a citizen in contemporary democratic society.
23-11-2018; Philosophical Transactions of the Royal Society A, 376(2135), 1-21
The deployment of various forms of AI, most notably machine learning algorithms, radically transforms many domains of social life. In this paper we focus on the news industry, where different algorithms are used to customize news offerings to increasingly specific audience preferences. While this personalization of news enables media organizations to be more receptive to their audience, it is questionable whether current deployments of algorithmic news recommenders (ANR) live up to their emancipatory promise. As in various other domains, people have little knowledge of what personal data are used and how such algorithmic curation comes about, let alone any concrete ways to influence these data-driven processes. Instead of going down the intricate avenue of trying to make ANR more transparent, we explore in this article ways to give people more influence over the information that news recommendation algorithms provide by thinking about, and enabling, possibilities to express voice. After differentiating four ideal-typical modalities of expressing voice (alternation, awareness, adjustment and obfuscation), illustrated with existing empirical examples, we present and argue for algorithmic recommender personae as a way for people to take more control over the algorithms that curate their news provision.
Helberger, N., Möller, J., Thurman, N., & Trilling, D.
19-10-2018; Digital Journalism
Prompted by the ongoing development of content personalization by social networks and mainstream news brands, and recent debates about balancing algorithmic and editorial selection, this study explores what audiences think about news selection mechanisms and why. Analysing data from a 26-country survey (N = 53,314), we report the extent to which audiences believe story selection by editors and story selection by algorithms are good ways to get news online and, using multi-level models, explore the relationships that exist between individuals’ characteristics and those beliefs. The results show that, collectively, audiences believe algorithmic selection guided by a user’s past consumption behaviour is a better way to get news than editorial curation. There are, however, significant variations in these beliefs at the individual level. Age, trust in news, concerns about privacy, mobile news access, paying for news, and six other variables had effects. Our results are partly in line with current general theory on algorithmic appreciation, but diverge in our findings on the relative appreciation of algorithms and experts, and in how the appreciation of algorithms can differ according to the data that drive them. We believe this divergence is partly due to our study’s focus on news, showing algorithmic appreciation has context-specific characteristics.
Helberger, N., & Möller, J.
05-07-2018; Report drafted for the Dutch Media Regulator (Commissariaat voor de Media)
In recent years, we have been witnessing a fundamental shift in the way news and current affairs are disseminated and mediated. Due to the exponential increase in available online content and technological developments in the field of recommendation systems, more and more citizens inform themselves through customized and curated sources, while turning away from mass-mediated information sources like TV news and newspapers. Algorithmic recommendation systems provide news users with tools to navigate the information overload and identify important and relevant information. They do so by performing a task that was once a key part of the journalistic profession: keeping the gates. In a way, news recommendation algorithms can create highly individualized gates through which only the information and news that serve the user best can pass. In theory, this is a great achievement that can make news exposure more efficient and interesting. In practice, there are many pitfalls when the power to select what we hear from the news shifts from professional editorial boards, which select the news according to professional standards, to opaque algorithms that are governed by their own logic, the logic of advertisers, or consumers' personal preferences.
25-06-2018; LSE Media Policy Blog
Es, B. van, Helberger, N., Möller, J., & Trilling, D.
08-03-2018; Information, Communication & Society
In the debate about filter bubbles caused by algorithmic news recommendation, the conceptualization of the two core concepts in this debate, diversity and algorithms, has received little attention in social scientific research. This paper examines the effect of multiple recommender systems on different diversity dimensions. To this end, it maps different values that diversity can serve, and a respective set of criteria that characterizes a diverse information offer in this particular conception of diversity. We make use of a data set of simulated article recommendations based on actual content of one of the major Dutch broadsheet newspapers and its users (N=21,973 articles, N=500 users). We find that all of the recommendation logics under study lead to a rather diverse set of recommendations that is on par with human editors, and that basing recommendations on user histories can substantially increase topic diversity within a recommendation set.
Helberger, N., Pierson, J., & Poell, T.
16-01-2018; The Information Society
Online platforms, from Facebook to Twitter, and from Coursera to Uber, have become deeply involved in a wide range of public activities, including journalism, civic engagement, education, and transport. As such, they have started to play a vital role in the realization of important public values and policy objectives associated with these activities. Based on insights from theories about risk sharing and the problem of many hands, this article develops a conceptual framework for the governance of the public role of platforms, and elaborates on the concept of cooperative responsibility for the realization of critical public policy objectives in Europe. It argues that the realization of public values in platform-based public activities cannot be adequately achieved by allocating responsibility to one central actor (as is currently common practice), but should be the result of dynamic interaction between platforms, users, and public institutions.
Eskens, S., Helberger, N., & Möller, J.
07-11-2017; Journal of Media Law, 9(2), 259-284
More and more news is personalised, based on our personal data and interests. As a result, the focus of media regulation moves from the news producer to the news recipient. This research asks what the fundamental right to receive information means for personalised news consumers and the obligation it imposes on states. However, the right to receive information is under-theorised. Therefore, we develop a framework to understand this right, starting from case law of the European Court of Human Rights. On this basis, we identify five perspectives on the right to receive information: political debate, truth finding, social cohesion, avoidance of censorship and self-development. We evaluate how news personalisation affects the right to receive information, considering these five different perspectives. Our research reveals important policy choices that must be made regarding personalised news considering news consumers’ rights.
D’Acunto, L., Helberger, N., & Karppinen, K.
19-01-2017; Information, Communication and Society
Personalized recommendations in search engines, social media and also in more traditional media increasingly raise concerns over potentially negative consequences for diversity and the quality of public discourse. The algorithmic filtering and adaption of online content to personal preferences and interests is often associated with a decrease in the diversity of information to which users are exposed. Notwithstanding the question of whether these claims are correct or not, this article discusses whether and how recommendations can also be designed to stimulate more diverse exposure to information and to break potential ‘filter bubbles’ rather than create them. Combining insights from democratic theory, computer science and law, the article makes suggestions for design principles and explores the potential and possible limits of ‘diversity sensitive design’.
Bodó, B., Eskens, S., Helberger, N., Möller, J., Trilling, D., Vreese, C.H. de, & Zuiderveen Borgesius, F.
27-10-2016; Computerrecht, 5, 255-262
Policymakers, scholars and others fear that personalised news can lead to filter bubbles: unique information spaces for each individual. Filter bubbles are said to pose a danger to our democracy. Based on a user's political preferences, a personalised news site could, for example, give certain topics or opinions a more or less prominent place. Personalisation is thought to be capable of leading to a new form of pillarisation, in which users of online personalised news encounter few differing political ideas. In this contribution we discuss empirical research into the extent and effects of personalisation. We distinguish between self-selected personalisation, where people explicitly indicate which topics they want to receive information about, and pre-selected personalisation, where algorithms determine which topics users receive information about. We conclude that, so far, there is little empirical evidence that justifies the concerns about filter bubbles.
Helberger, N., Irion, K., Möller, J., Trilling, D., & de Vreese, C.H.
2016-09-27; info, 18(6)
A shared issue agenda provides democracies with a set of topics that structure the public debate. The advent of personalized news media that use smart algorithms to tailor the news offer to the user challenges the established way of setting the agenda of such a common core of issues. This paper tests the effects of personalized news use on perceived importance of these issues in the common core. In particular we study whether personalized news use leads to a concentration at the top of the issue agenda or to a more diverse issue agenda with a long tail of topics. Based on a cross-sectional survey of a representative population sample (N=1556), we find that personalized news use does not lead to a small common core in which few topics are discussed extensively, yet there is a relationship between personalized news use and a preference for less discussed topics. This is a result of a specific user profile of personalized news users: younger, more educated news users are more interested in topics at the fringes of the common core and also make more use of personalized news offers. The results are discussed in the light of media diversity and recent advances in public sphere research.
2016-09-16; LSE blog
Facebook’s approach to allowing, censoring or prioritising content that appears in the news feed has recently been the focus of much attention, both media and governmental. Professor Natali Helberger of the Institute for Information Law at the University of Amsterdam argues that we need to seek to understand the new kind of editorial role that Facebook is playing, in order to know how to tackle the questions it raises.
Helberger, N., & Trilling, D.
2016-05-26; LSE blog
Facebook’s use of human editors may bring comfort to some, but there are wider issues to do with editorial responsibility that need to be addressed.
Zuiderveen Borgesius, F. J., Trilling, D., Möller, J., Bodó, B., de Vreese, C. H., & Helberger, N.
2016-04-01; Internet Policy Review, 5(1)
Some fear that personalised communication can lead to information cocoons or filter bubbles. For instance, a personalised news website could give more prominence to conservative or liberal media items, based on the (assumed) political interests of the user. As a result, users may encounter only a limited range of political ideas. We synthesise empirical research on the extent and effects of self-selected personalisation, where people actively choose which content they receive, and pre-selected personalisation, where algorithms personalise content for users without any deliberate user choice. We conclude that at present there is little empirical evidence that warrants any worries about filter bubbles.
Helberger, N., Kleinen-von Königslöw, K., & van der Noll, R.
2015-09-01; info, 17(6)
2015-05-07; International Journal of Communication, 9(17)
Personalized recommendations provide new opportunities to engage with audiences and influence media choices. Should the public-service media use such algorithmic profiling and targeting to guide audiences and stimulate more diverse choices? And if they do, is this a brave new world we would like to live in? This article outlines new opportunities for the public-service media to fulfill their commitment to media diversity and highlights some of the ethical and normative considerations that will play a role. The article concludes with a call for a new body of “algorithmic media ethics.”
2015-01-13; Het Parool
Interview in the Dutch newspaper Het Parool about the changing position of social media users.
2014-12-02; Financieel Dagblad
Interview in the Dutch newspaper Financieel Dagblad of 29 November 2014 concerning the threats and potential of personalisation.
2014-10-10; inaugural lecture delivered on the occasion of the acceptance of the position of professor of information law, especially concerning the use of information, at the Faculty of Law at the University of Amsterdam on Friday 19 September 2014 (video).
In the digital media environment user attention is scarce and competition for ‘eyeballs’ is fierce. Profiling and targeting users with customized news and advertisements is widely seen as a solution, and part of a larger trend to invest in what the New York Times has called ‘smart new strategies for growing our audience’. The shift from public information intermediary to personal information service creates new dynamics but also new imbalances in the relationship between the media and their users. In my inaugural speech I will argue that, to restore the balance, the media and regulators in Brussels and The Hague need to develop a vision of how to deal with issues such as media user privacy, editorial integrity and, more generally, ‘fair algorithmic media practices’.