Claes featured in a 25-minute interview on The Open Mind on US public television, speaking about the current information crisis and populism from an international perspective. See the episode here.
Affiliate member Magdalena Wojcieszak was awarded a prestigious ERC grant:
Europeans Exposed to Dissimilar Views in the Media: Investigating Backfire Effects
In many countries, hostility, distrust, and intolerance are on the rise. In this context, scholars claim that encountering dissimilar arguments fosters tolerance, and policymakers promote exposure to different views in the media. Yet, these efforts can make people more extreme and more hostile toward the other side. Magdalena Wojcieszak’s project will use online behaviour tracking, automated content analyses, panel surveys, qualitative work, and experiments in four countries to address a fundamental question: Under which conditions exactly does exposure to dissimilar views in the media amplify or attenuate hostilities among citizens with different opinions? The results will offer insights for scholars, policymakers and practitioners working on media diversity and social cohesion.
Personalised Communication will contribute to the NWA Big Data Route with a project called “FairNews: Nieuwsvoorziening in een Big Data tijdperk (Fair News: News Provision in a Big Data Age)”. Led by Claes de Vreese (UvA/CW), Claudia Hauff (TUD), and Joris van Hoboken (UvA/IvIR), and in collaboration with the Dutch quality newspaper de Volkskrant, the project will address the question: how far can and may algorithms go in filtering information?
More info here.
Damian and Judith gave a talk at Bessensap, an annual event organized by the Dutch Science Foundation where scholars present their work to science journalists. In their talk, they gave an overview of their finished and ongoing research, highlighting the discrepancy between popular beliefs about filter bubbles and algorithmic news recommendation and the empirical evidence.
Together with colleagues from IViR, Frederik published a new study, “An assessment of the Commission’s Proposal on Privacy and Electronic Communications” (by Frederik Zuiderveen Borgesius, Joris van Hoboken, Ronan Fahy, Kristina Irion, and Max Rozendaal).
This study, commissioned by the European Parliament’s Policy Department for Citizens’ Rights and Constitutional Affairs at the request of the LIBE Committee, appraises the European Commission’s proposal for an ePrivacy Regulation. The study assesses whether the proposal would ensure that the right to the protection of personal data, the right to respect for private life and communications, and related rights enjoy a high standard of protection. The study also highlights the proposal’s potential benefits and drawbacks more generally. The proposed ePrivacy Regulation aims to protect privacy on the internet, and includes rules on, for instance, online tracking.
Read the full study here.
Frederik presented the preliminary results at the European Parliament. A video is available here (from 26:00 minutes on).
Earlier this year, Frederik also spoke at the European Parliament about the ePrivacy proposal, video here.
Damian wrote a blog post commenting on the current debate about the role of algorithms. He argues that – while there are valid reasons for concern – critics who see algorithms as evil or scary per se miss the point and hinder a constructive debate.
On Tuesday, Damian gave a lecture, “Big Data: Why social scientists should care”, at the Amsterdam Research Initiative, discussing the role of Big Data in society as well as in research. He argued that social scientists have to study so-called Big Data as a societal phenomenon on the one hand, but can also use these techniques to answer social-scientific research questions on the other. Directly before, he had given a two-day workshop at Radboud University Nijmegen on using Python to answer social-scientific research questions.
When Instagram announced the implementation of algorithmic personalization on its platform, a heated debate arose: several users instantly expressed their strong discontent under the hashtag #RIPINSTAGRAM. In this paper, we examine how users commented on this announcement. Drawing on the conceptual starting point of framing user comments as “counter-narratives” (Andrews, 2004), which oppose Instagram’s organizational narrative of improving the user experience, the study explores in greater detail the main concerns users bring forward. The two-step analysis draws on a total of 8,645 comments collected from Twitter and Instagram. The Twitter data were used to develop preliminary inductive categories describing users’ counter-narratives. Thereafter, we systematically coded all data extracted from Instagram in order to enhance, adjust and revise the preliminary categories. This inductive coding approach (Mayring, 2000), combined with an in-depth qualitative analysis, resulted in the identification of four counter-narratives brought forward by users: 1) algorithmic hegemony; 2) violation of user autonomy; 3) prevalence of commercial interests; and 4) deification of mainstream. All of these counter-narratives relate to ongoing public debates about the social implications of algorithmic personalization. In conclusion, the paper suggests that the identified counter-narratives tell a story of resistance: while technological advancement is generally welcomed and celebrated, the findings point towards growing user resistance to algorithmic personalization.
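The two-step workflow described in the abstract can be illustrated with a minimal sketch: develop preliminary categories from one data set, then systematically apply them to the second. The category keywords and example comments below are invented for illustration; the actual study used in-depth qualitative coding, not keyword rules.

```python
# Hypothetical sketch of a two-step inductive coding workflow.
# Keywords and comments are invented examples, not data from the study.
from collections import defaultdict

# Step 1: preliminary categories, developed inductively from Twitter comments.
preliminary_categories = {
    "algorithmic hegemony": ["algorithm decides", "no control over feed"],
    "violation of user autonomy": ["let me choose", "my own order"],
    "prevalence of commercial interests": ["ads", "advertisers"],
    "deification of mainstream": ["only big accounts", "popular posts"],
}

def code_comment(comment, categories):
    """Assign a comment to every category whose keywords it mentions."""
    comment = comment.lower()
    return [name for name, keywords in categories.items()
            if any(kw in comment for kw in keywords)]

# Step 2: systematically code the Instagram comments; comments that match
# no category would prompt revision of the preliminary category scheme.
instagram_comments = [
    "So now the algorithm decides what I see? #RIPINSTAGRAM",
    "Just more room for advertisers, great.",
    "Why not let me choose my own order?",
]
tally = defaultdict(list)
for c in instagram_comments:
    for category in code_comment(c, preliminary_categories):
        tally[category].append(c)

for category, comments in tally.items():
    print(f"{category}: {len(comments)} comment(s)")
```

In the real study the categories were refined iteratively against the data; the keyword matching here only stands in for that interpretive step.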
Journalist Maurits Martijn wrote a piece on Cambridge Analytica, the online political microtargeting company, which essentially applies behavioural targeting marketing techniques to political campaigns. Martijn questions whether Cambridge Analytica is really that powerful, and whether it really caused Trump to win the US election. Both Claes de Vreese and Frederik Zuiderveen Borgesius are quoted in the piece.
The Court of Justice of the European Union decided an important case on 21 December 2016. In short, the Court prohibits mass metadata surveillance: EU member states are not allowed to oblige telecommunications companies to store the metadata of all telecom users.
The Court says that mass metadata surveillance, even if it may help to catch criminals or terrorists, violates people’s privacy and data protection rights. The Court adds that metadata are just as sensitive as the content of communications. Metadata show, for instance, who you call and when. The Court says that such metadata are “no less sensitive, having regard to the right to privacy, than the actual content of communications.”
Frederik commented on the case for Dutch media:
- NOS: http://nos.nl/artikel/2149484-eu-hof-locatie-en-telecomdata-mogen-niet-zomaar-worden-verzameld.html
- Tweakers: https://tweakers.net/nieuws/119273/eu-hof-massaal-verzamelen-telecomgegevens-is-niet-toegestaan.html
- BNR Nieuwsradio (around 37 minutes): https://www.bnr.nl/player/audio/10053540/10315699
The full judgment by the Court of Justice of the European Union is here: http://curia.europa.eu/juris/document/document.jsf?text=&docid=186492&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=566657
For an analysis of metadata surveillance and human rights, see:
Frederik J. Zuiderveen Borgesius and Axel Arnbak, New Data Security Requirements and the Proceduralization of Mass Surveillance Law after the European Data Retention Case, Amsterdam Law School Research Paper No. 2015-41. https://ssrn.com/abstract=2678860