Personalised News

On the web, news media are profiling and targeting news users in order to serve them news stories and advertisements that match their individual interests. For example, the New York Times has a section on its homepage titled “Recommended for you”, which contains an automatically created selection of articles presumed to be of particular relevance to the reader. European news organisations, such as the BBC in the UK and de Volkskrant in the Netherlands, as well as newer players like Facebook, are experimenting with personalization as well.

News personalization can allow media companies to better serve their users by, for example, helping them deal with information overload or by serving them more interesting content. Monetizing the resulting increase in user attention can also bring financial benefits for the industry. However, the increasing personalization of news also raises concerns about the interests of media users, the role of the media in society, and the media’s relationship with news readers. For example, what are users’ concerns with regard to personalization, and under which conditions are they most likely to accept it? Who controls the algorithm that determines what users see? What rights do users have in this process, and how does the introduction of personalization interact with traditional media values? And how can the trust of the public be safeguarded in light of these changes?

In order to answer such questions, the personalised news project combines legal and empirical research. The project will provide its answers in three stages. First, the postdocs explore how personalization affects the relationship between media users and the media. Second, informed by these findings, two PhD candidates investigate the legal rules on users’ rights to freedom of expression and privacy, as well as editorial responsibility in the context of personalised news. Finally, the principal investigator will develop a new normative framework through which news personalization can be assessed, exploring the extent to which the government could stimulate new media strategies for personalised, public-interest content.


The team

The research team investigates personalization from the perspective of both the media and the users and consists of one professor of information law, three postdoctoral researchers, and two legal PhD candidates. More information can be found on their profile pages, which are linked below, or on the personalized communications team page.

prof. dr. N. Helberger

Natali Helberger is Professor of Information Law, with a special focus on the use of information, at the Institute for Information Law (IViR). She leads the personalized news project and will develop a normative framework of personalised media based on the empirical research and political theories about the democratic role of the media.

dr. B. Bodó

Balázs Bodó is a socio-legal researcher at the Institute for Information Law (IViR). In his research for the personalized news project he explores the mechanisms behind personalization and the attitudes, concerns, and economic and ethical considerations of the news media, and how these translate into journalistic practices, editorial guidelines, and emerging algorithmic journalistic ethics.

dr. J. Möller

Judith Möller is a postdoctoral researcher at the Amsterdam School of Communication Research (ASCoR). Her research in the personalized news project focuses on user attitudes towards implicit media personalization as well as the conditions that affect users’ acceptance of media personalization.

dr. J.L.D. Neys

Joyce Neys is a postdoctoral researcher at the Institute for Information Law (IViR). Her research focuses mainly on the perspective of the user. She investigates the conditions that influence the social acceptability or rejection of tracking and targeting, in order to contribute to a further (re)conceptualization of the democratic contribution of news media in contemporary society from a user perspective. By exploring the practices that different users adopt in the context of news personalisation, she aims to shed light on how these strategies inform our understanding of what it means to be a citizen in contemporary democratic society.

S.J. Eskens

Sarah Eskens is a PhD candidate at the Institute for Information Law (IViR). She researches the legal consequences of news personalization from the perspective of the users, with a focus on the way in which rules of privacy and data protection law can protect their right to receive information.

M.Z. van Drunen

Max van Drunen is a PhD candidate at the Institute for Information Law (IViR). His research explores how personalization impacts editorial responsibility in the news industry, and how this concept can legally be furthered and protected in the context of personalisation.



Publications


Shrinking core? Exploring the differential agenda setting power of traditional and personalized news

Helberger, N., Irion, K., Möller, J., Trilling, D. & de Vreese, C.H.

2016-09-27; info, 18(6)

A shared issue agenda provides democracies with a set of topics that structure the public debate. The advent of personalized news media that use smart algorithms to tailor the news offer to the user challenges the established way of setting the agenda of such a common core of issues. This paper tests the effects of personalized news use on perceived importance of these issues in the common core. In particular we study whether personalized news use leads to a concentration at the top of the issue agenda or to a more diverse issue agenda with a long tail of topics. Based on a cross-sectional survey of a representative population sample (N=1556), we find that personalized news use does not lead to a small common core in which few topics are discussed extensively, yet there is a relationship between personalized news use and a preference for less discussed topics. This is a result of a specific user profile of personalized news users: younger, more educated news users are more interested in topics at the fringes of the common core and also make more use of personalized news offers. The results are discussed in the light of media diversity and recent advances in public sphere research.


Facebook is a new breed of editor: a social editor

Helberger, N.

2016-09-16; LSE blog

Facebook’s approach to allowing, censoring or prioritising content that appears in the news feed has recently been the focus of much attention, both media and governmental. Professor Natali Helberger of the Institute for Information Law at the University of Amsterdam argues that we need to seek to understand the new kind of editorial role that Facebook is playing, in order to know how to tackle the questions it raises.


Facebook is a news editor: the real issues to be concerned about

Helberger, N. & Trilling, D.

2016-05-26; LSE blog

Facebook’s use of human editors may bring comfort to some, but there are wider issues to do with editorial responsibility that need to be addressed.


Should we worry about filter bubbles?

Bodó, B., Helberger, N., Möller, J., Trilling, D., de Vreese, C.H. & Zuiderveen Borgesius, F.

2016-04-01; Internet Policy Review, 5(1)

Some fear that personalised communication can lead to information cocoons or filter bubbles. For instance, a personalised news website could give more prominence to conservative or liberal media items, based on the (assumed) political interests of the user. As a result, users may encounter only a limited range of political ideas. We synthesise empirical research on the extent and effects of self-selected personalisation, where people actively choose which content they receive, and pre-selected personalisation, where algorithms personalise content for users without any deliberate user choice. We conclude that at present there is little empirical evidence that warrants any worries about filter bubbles.


Regulating the new information intermediaries as gatekeepers of information diversity

Helberger, N., Kleinen-von Königslöw, K. & van der Noll, R.

2015-09-01; info, 17(6)


Merely Facilitating or Actively Stimulating Diverse Media Choices? Public Service Media at the Crossroad

Helberger, N.

2015-05-07; International Journal of Communication, 9(17)

Personalized recommendations provide new opportunities to engage with audiences and influence media choices. Should the public-service media use such algorithmic profiling and targeting to guide audiences and stimulate more diverse choices? And if they do, is this a brave new world we would like to live in? This article outlines new opportunities for the public-service media to fulfill their commitment to media diversity and highlights some of the ethical and normative considerations that will play a role. The article concludes with a call for a new body of “algorithmic media ethics.”


Nieuws à la carte

Helberger, N.

2015-01-13; Het Parool

Interview in the Dutch newspaper Het Parool about the changing position of social media users.


‘Personaliseren sites leidt tot manipulatie’ [‘Personalising sites leads to manipulation’]

Helberger, N.

2014-12-02; Financieel Dagblad

Interview in the Dutch newspaper Financieel Dagblad of 29 November 2014 concerning the threats and potential of personalisation.


Media, users and algorithms: towards a new balance

Helberger, N.

2014-10-10; inaugural lecture delivered on the occasion of the acceptance of the position of professor of information law, with a special focus on the use of information, at the Faculty of Law of the University of Amsterdam on Friday 19 September 2014 (video).

In the digital media environment user attention is scarce and competition for ‘eyeballs’ is fierce. Profiling and targeting users with customized news and advertisements is widely seen as a solution, and part of a larger trend to invest in what the New York Times has called ‘smart new strategies for growing our audience’. The shift from public information intermediary to personal information service creates new dynamics but also new imbalances in the relationship between the media and their users. In my inaugural speech I will state that to restore the balance, the media and regulators in Brussels and The Hague need to develop a vision of how to deal with issues such as media user privacy, editorial integrity and more generally ‘fair algorithmic media practices’.”