On Tuesday, Damian gave a lecture “Big Data: Why social scientists should care” at the Amsterdam Research Initiative, discussing the role of Big Data in society as well as in research. He argued that social scientists on the one hand have to observe the role of so-called Big Data as a societal phenomenon, but on the other hand also can make use of these techniques to answer social-scientific research questions. Directly before, he had given a two-day workshop on the use of Python to answer social-scientific research questions at Radboud University Nijmegen.
When Instagram announced the implementation of algorithmic personalization on its platform, a heated debate arose. Several users instantly expressed their strong discontent under the hashtag #RIPINSTAGRAM. In this paper, we examine how users commented on the announcement of Instagram implementing algorithmic personalization. Taking as a conceptual starting point the framing of user comments as “counter-narratives” (Andrews, 2004) that oppose Instagram’s organizational narrative of improving the user experience, the study explores in greater detail the main concerns users bring forth. The two-step analysis draws on a total of 8,645 comments collected from Twitter and Instagram. The Twitter data were used to develop preliminary inductive categories describing users’ counter-narratives. Thereafter, we systematically coded the data extracted from Instagram in order to enhance, adjust, and revise the preliminary categories. This inductive coding approach (Mayring, 2000), combined with an in-depth qualitative analysis, resulted in the identification of four counter-narratives brought forth by users: 1) algorithmic hegemony; 2) violation of user autonomy; 3) prevalence of commercial interests; and 4) deification of the mainstream. All of these counter-narratives relate to ongoing public debates about the social implications of algorithmic personalization. In conclusion, the paper suggests that the identified counter-narratives tell a story of resistance. While technological advancement is generally welcomed and celebrated, the findings of this study point towards growing user resistance to algorithmic personalization.
Journalist Maurits Martijn wrote a piece on Cambridge Analytica, the online political microtargeting company. The company essentially applies behavioural targeting marketing techniques to political campaigns. Martijn questions whether Cambridge Analytica is really that powerful, and whether it really caused Trump to win the US elections. Both Claes de Vreese and Frederik Zuiderveen Borgesius are quoted in the piece.
The Court of Justice of the European Union decided an important case on 21 December 2016. In short, the Court prohibits mass metadata surveillance. The Court says that EU member states are not allowed to impose an obligation on telecommunications companies to store metadata of all telecom-users.
The Court says that mass metadata surveillance, even if it may help to catch criminals or terrorists, violates people’s privacy and data protection rights. The Court adds that metadata are just as sensitive as the content of communications. Metadata show, for instance, who you call and when. The Court says that such metadata are “no less sensitive, having regard to the right to privacy, than the actual content of communications.”
Frederik commented on the case for Dutch media:
- NOS: http://nos.nl/artikel/2149484-eu-hof-locatie-en-telecomdata-mogen-niet-zomaar-worden-verzameld.html
- Tweakers: https://tweakers.net/nieuws/119273/eu-hof-massaal-verzamelen-telecomgegevens-is-niet-toegestaan.html
- BNR Nieuwsradio (around 37 minutes): https://www.bnr.nl/player/audio/10053540/10315699
The full judgment by the Court of Justice of the European Union is here: http://curia.europa.eu/juris/document/document.jsf?text=&docid=186492&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=566657
For an analysis of metadata surveillance and human rights, see:
Frederik J. Zuiderveen Borgesius and Axel Arnbak, New Data Security Requirements and the Proceduralization of Mass Surveillance Law after the European Data Retention Case, Amsterdam Law School Research Paper No. 2015-41. https://ssrn.com/abstract=2678860
This week, there were a lot of media appearances related to the discussion of the role of filter bubbles and fake news in election campaigns. Damian is quoted in de Volkskrant and was interviewed for the television broadcast EenVandaag (around minute 27:00). Sanne was interviewed by the NOS. And we published a piece on Stuk rood vlees.
On 9 November, Frederik Zuiderveen Borgesius participates in the Privacy Platform in Brussels (‘Automated profiling after the GDPR, more regulation needed?’), organized by Sophie in ‘t Veld. The event will address the potential risks of automated profiling, whether the provisions on profiling in the General Data Protection Regulation will suffice, and what can be expected of the revision of the ePrivacy Directive.
Chair: Ms Sophie in ‘t Veld MEP
- Frederik Borgesius: Researcher at the Institute for Information Law (IViR) at the University of Amsterdam
- Jeremy Rollison: Director EU Government Affairs at Microsoft
- Sachiko Scheuing: Co-chair of FEDMA, the Federation of European Direct and Interactive Marketing
- Tal Zarsky: Professor at the University of Haifa
More info: http://www.sophieintveld.eu/agenda#
This weekend, there was a lot of media attention on Filter Bubbles. On Friday, quality newspaper Trouw published a two-page story based on interviews with Judith, Damian, and Frederik. The piece reflected some outcomes of our research project, mainly that filter bubbles – at least in the Netherlands, at this point in time – are less of a problem than often assumed.
On Saturday, Damian was interviewed on Radio 1 (Argos) about the same topic, as well as about political microtargeting. You can listen to the fragment here.
Coincidentally, and unrelated to our project, quality newspaper de Volkskrant also published a large story on filter bubbles related to music. It discussed the relationship between Spotify usage and music taste, and also hinted at the need for diversity in a music recommendation algorithm, to prevent it from becoming ‘boring’.
Frederik is quoted in an article on online tracking on the Belgian news site Apache: “Dit gebeurt er met jouw data op de grootste Belgische websites” (“This is what happens to your data on the biggest Belgian websites”).
- Frederik is quoted, in Dutch, in this article on a recent judgment of the Court of Justice of the European Union in the Breyer case. The Court decided, in short, that an IP address is typically personal data. If information is regarded as personal data, European data privacy law applies. If data privacy law applies, companies must ensure, for instance, adequate data security. For a detailed discussion of the scope of the legal definition of personal data, see Frederik’s paper ‘Singling Out People Without Knowing Their Names – Behavioural Targeting, Pseudonymous Data, and the New Data Protection Regulation’, https://ssrn.com/abstract=2733115
- On 25 October, Frederik speaks at the ‘Big Data debate’, organised by the Dutch Association of Insurers. Frederik will discuss the risks of unfair or even illegal discrimination that can result from big data. More info here.
On Thursday 6 October, Frederik speaks in a panel on Facebook, during the Big Brother Awards Belgium.
Joe McNamee (executive director of EDRi, European Digital Rights) moderates the panel. Speakers are: Stephen Deadman (Facebook), Matthias Matthiesen (IAB, Interactive Advertising Bureau), Brendan Van Alsenoy (Belgian Data Protection Authority), Estelle Massé (AccessNow), and Frederik.
See for more information: https://bigbrotherawards.be/en