Category Archives: scientific articles

User perspectives on social media data mining

What do social media users think about social media data mining? This article reports on focus group research in three European countries (the United Kingdom, Norway and Spain). The method created a space in which to make sense of the diverse findings of quantitative studies, which relate to individual differences (such as extent of social media use or awareness of social media data mining) and differences in social media data mining practices themselves (such as the type of data gathered, the purpose for which data are mined and whether transparent information about data mining is available). Moving beyond privacy and surveillance made it possible to identify a concern for fairness as a common trope among users, which informed their varying viewpoints on distinct data mining practices. The authors argue that this concern for fairness can be understood as contextual integrity in practice (Nissenbaum, 2009) and as part of broader concerns about well-being and social justice.

Source: Convergence – Helen Kennedy, Dag Elgesem, Cristina Miguel, 2017

Search and Politics: The Uses and Impacts of Search in Britain, France, Germany, Italy, Poland, Spain, and the United States

What is the role of search in shaping opinion? Survey results indicate, among other things, that:

1. The filter bubble argument is overstated, as Internet users expose themselves to a variety of opinions and viewpoints online and through a diversity of media. Search needs to be viewed in a context of multiple media.

2. Concerns over echo chambers are also overstated, as Internet users are exposed to diverse viewpoints both online and offline. Most users are not silenced by contrasting views, nor do they silence those who disagree with them.

3. Fake news has attracted disproportionate levels of concern in light of people’s actual practices. Internet users are generally skeptical of information across all media and know how to check the accuracy and validity of information found through search, on social media, or on the Internet in general.

Source: Search and Politics: The Uses and Impacts of Search in Britain, France, Germany, Italy, Poland, Spain, and the United States by William H. Dutton, Bianca Christin Reisdorf, Elizabeth Dubois, Grant Blank :: SSRN

Mahnke Skrubbeltrang, Grunnet, & Traasdahl Tarp: #RIPINSTAGRAM: Examining user’s counter-narratives opposing the introduction of algorithmic personalization on Instagram

When Instagram announced the implementation of algorithmic personalization on its platform, a heated debate arose, and several users instantly expressed their strong discontent under the hashtag #RIPINSTAGRAM. In this paper, we examine how users commented on Instagram’s announcement. Drawing on the conceptual starting point of framing user comments as “counter-narratives” (Andrews, 2004) that oppose Instagram’s organizational narrative of improving the user experience, the study explores in greater detail the main concerns users bring forth. The two-step analysis draws on a total of 8,645 comments collected from Twitter and Instagram. The Twitter data were used to develop preliminary inductive categories describing users’ counter-narratives; thereafter, we systematically coded all the data extracted from Instagram in order to enhance, adjust and revise those preliminary categories. This inductive coding approach (Mayring, 2000), combined with an in-depth qualitative analysis, resulted in the identification of four counter-narratives brought forth by users: 1) algorithmic hegemony; 2) violation of user autonomy; 3) prevalence of commercial interests; and 4) deification of mainstream. All of these counter-narratives relate to ongoing public debates about the social implications of algorithmic personalization. In conclusion, the paper suggests that the identified counter-narratives tell a story of resistance: while technological advancement is generally welcomed and celebrated, the findings of this study point towards a growing user resistance to algorithmic personalization.


The Social Power of Algorithms

Introduction (pp. 1-13, published online 8 Aug 2016)
Article (pp. 137-150, published online 20 Jun 2016)

Helping Computers Explain Their Reasoning

Edgar Meij, a senior data scientist on Bloomberg’s news search experience team, is working to explain the logic behind related entities in search results.

One of the first steps toward that goal is to design an algorithm that can explain the relationship between two terms – called entities – in plain English. In a paper co-authored with two researchers from the University of Amsterdam, Prof. Dr. Maarten de Rijke and Nikos Voskarides, he presents a methodology for doing just that.

Source: Helping Computers Explain Their Reasoning: New Research by Edgar Meij | Tech at Bloomberg
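
The paper’s own method is not excerpted here, so as a rough sketch of the task itself (not the authors’ learning-to-rank approach), the snippet below picks a plain-English sentence describing how two entities relate. The toy corpus and the scoring features are assumptions made purely for illustration.

```python
# Hypothetical sketch: given two entities, find and rank candidate
# sentences that describe their relationship in plain English. The actual
# paper learns a ranking model over many features; this hand-rolled score
# is only illustrative.

def explain_relationship(entity_a, entity_b, corpus):
    """Return the best-scoring sentence mentioning both entities, or None."""
    candidates = [
        s for s in corpus
        if entity_a.lower() in s.lower() and entity_b.lower() in s.lower()
    ]
    if not candidates:
        return None

    def score(sentence):
        lower = sentence.lower()
        # Prefer concise sentences: long ones tend to bury the relationship.
        brevity = 1.0 / len(sentence.split())
        # Prefer sentences where the two entities appear close together.
        gap = abs(lower.index(entity_a.lower()) - lower.index(entity_b.lower()))
        return brevity + 1.0 / (1 + gap)

    return max(candidates, key=score)

corpus = [
    "Ada Lovelace worked with Charles Babbage on the Analytical Engine.",
    "Charles Babbage designed mechanical computers in the 19th century.",
]
print(explain_relationship("Ada Lovelace", "Charles Babbage", corpus))
```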

Virtual Competition – The Promise and Perils of the Algorithm-Driven Economy

Shoppers with Internet access and a bargain-hunting impulse can find a universe of products at their fingertips. In this thought-provoking exposé, Ariel Ezrachi and Maurice Stucke invite us to take a harder look at today’s app-assisted paradise of digital shopping. While consumers reap many benefits from online purchasing, the sophisticated algorithms and data-crunching that make browsing so convenient are also changing the nature of market competition, and not always for the better.

Computers colluding is one danger. Although long-standing laws prevent companies from fixing prices, data-driven algorithms can now quickly monitor competitors’ prices and adjust their own prices accordingly. So what is seemingly beneficial—increased price transparency—ironically can end up harming consumers. A second danger is behavioral discrimination. Here, companies track and profile consumers to get them to buy goods at the highest price they are willing to pay. The rise of super-platforms and their “frenemy” relationship with independent app developers raises a third danger. By controlling key platforms (such as the operating system of smartphones), data-driven monopolies dictate the flow of personal data and determine who gets to exploit potential buyers.
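
To make the first danger concrete, here is a toy simulation (an illustration of the mechanism, not anything from the book): two sellers each run a naive “match the rival, otherwise nudge upward” repricing rule. Neither communicates with the other, yet prices ratchet up toward a ceiling instead of competing down toward cost. All numbers are invented.

```python
# Toy simulation (not from the book): algorithmic repricing without any
# agreement between sellers. Fast price-matching removes the incentive to
# undercut, so both algorithms drift upward together.

COST = 50.0      # marginal cost: the competitive price floor (assumed)
CEILING = 100.0  # hypothetical monopoly price

def reprice(own, rival):
    """Undercut only if the rival is cheaper; otherwise inch upward."""
    if rival < own:
        return max(rival, COST)       # match the rival, never below cost
    return min(own * 1.02, CEILING)   # rival matched us: test a 2% rise

price_a, price_b = 60.0, 55.0
for _ in range(30):
    price_a = reprice(price_a, price_b)
    price_b = reprice(price_b, price_a)

print(f"after 30 rounds: A={price_a:.2f}, B={price_b:.2f}")
# Both prices end up far above COST: transparency plus instant reaction
# mimics collusion without any explicit agreement.
```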

Virtual Competition raises timely questions. To what extent does the “invisible hand” still hold sway? In markets continually manipulated by bots and algorithms, is competitive pricing an illusion? Can our current laws protect consumers? The changing market reality is already shifting power into the hands of the few. Ezrachi and Stucke explore the resulting risks to competition, our democratic ideals, and our economic and overall well-being.


The Weeping Angels are back, and they attack our privacy via smart TVs

One fundamental difference between traditional television sets and smart televisions is that the latter are connected to the internet. Like other connected devices, smart TVs make it possible to track what users do online and even offline, and therefore raise privacy and data protection issues. We recently argued that the privacy of media users requires special attention from policymakers, not only from the perspective of data protection law but also from that of media and communication law and policy. Tracking what people watch online can reveal sensitive insights into individual interests, political leanings, religious beliefs and cultural identity. In our article we challenge the lack of attention, at the EU level, to the specific media-policy concerns about viewers’ privacy and data protection. The latest revelations about the CIA’s attempts to turn smart TVs into eavesdropping devices (innocuously named Weeping Angel) just underscore how sensitive the issue of media users’ privacy really is, and how badly it needs protection.

Smart TV eavesdropping: a wake-up call

It is rather ironic that the secret services CIA and MI5 chose the name Weeping Angels, after the creatures from the British series Doctor Who, for their joint smart TV spy program. The latest media revelations – drawing on the WikiLeaks release Vault 7: CIA Hacking Tools Revealed (7 March 2017) – that US security services are capable of using smart TVs to eavesdrop on users are no less creepy than the Angels’ attacks. Of all devices, the CIA has been targeting the TV set, our trusted friend in the living room, to capture audio (even when the set appears to be switched off), extract the credentials of the Wi-Fi network the TV uses, and harvest other usernames and passwords stored in the TV browser. This incident is yet another wake-up call to European policymakers to better protect the security of connected consumer devices and the privacy and right to confidentiality of media users.

The connective capabilities of smart TVs have led to public outcries in several European countries over violations of users’ privacy in a variety of ways. In 2013, the media reported that a smart TV had been found to transfer information about what users were viewing to the equipment manufacturer. In 2015, the voice control of a smart TV made headlines for inadvertently eavesdropping on private conversations. We reviewed a number of implementation and enforcement actions in Germany and the Netherlands. Our analysis shows how users’ agency is significantly reduced when information duties are not complied with and default settings are not privacy-preserving. Overzealous data collection via smart TVs is not just a European issue: the US Federal Trade Commission recently fined a smart TV provider for recording the viewing habits of 11 million customers and selling them to third parties.

Beyond privacy: freedom of expression at stake

One of the particularities of the discussion on smart TV privacy is that it is treated almost exclusively as an issue of data protection and privacy, while the debate remains oblivious to the broader, and at least equally worrying, implications for freedom of expression and media law and policy. In its 2013 Green Paper on Convergence, for example, the European Commission does acknowledge that “the processing of personal data is often the prerequisite for the functioning of new services, even though the individual is often not fully aware of the collection and processing of personal data”. However, the document makes it very clear that the European Commission believes these matters should, in the first place, be a matter for EU data protection regulation. We argue, conversely, that the issue of users’ viewing privacy is also a matter for media and communication law and policy, at the level of both the EU and its member states. This is because of the potential impact that tracking users online can have on their freedom to inform themselves and to exercise their rights to freedom of expression. Privacy in this context is instrumental in furthering our freedom of expression rights, which is why the privacy of media users deserves special attention. The TV set is not only the device that sees us lounging on the couch in our pyjamas (in itself reason enough to worry about privacy). More importantly, we use TV services to inform ourselves and to prepare for our role as informed citizens. As scholars have convincingly argued, a certain level of intellectual privacy, or breathing space, is indispensable for our ability to form ideas and opinions – unmonitored by device producers, advertisers and the CIA.

The Council of Europe noted explicitly in the context of tracking users online that “[t]hese capabilities and practices can have a chilling effect on citizen participation in social, cultural and political life and, in the longer term, could have damaging effects on democracy. … More generally, they can endanger the exercise of freedom of expression and the right to receive and impart information protected under Article 10 of the European Convention on Human Rights”.

German data protection authorities observe that “[t]elevision is a central medium for conveying information and an essential condition for freedom of expression. The right to unobstructed information access is constitutionally protected and a basic prerequisite for the democratic order. The comprehensive collection, processing and using of information about user behaviour will adversely affect the exercise of that right”. The Dutch data protection authority underscores that personal data collected via smart TVs are sensitive because they can reveal very individual patterns which could potentially disclose a person’s social background or financial or family situation. These initiatives confirm that users of media services may require a higher, or at least a different, level of protection when consuming media content than, for example, when buying shoes online. A talk by Alexander Nix of Cambridge Analytica (listen in at 9’22” of the video) shows just how keen companies are to tap into data about people’s viewing behaviour.

So far, only Germany has specific privacy safeguards in its media law (Sections 13 and 15 of the German Telemedia Act). Remarkably, viewers in Germany have the right to use their TV anonymously, insofar as this is technically feasible. German law moreover prohibits the sharing of personal data on the use of television and on-demand audiovisual media services with third parties; only anonymised usage data may be shared with third parties for market research and analytical purposes. This ties in with literature that argues in favour of protecting privacy in order to preserve the freedom to receive information and hold opinions.

Get it done with the new e-Privacy Regulation

Thankfully, there are several opportunities for the European legislator to show that it has heard the wake-up call and is moving into action to protect media users. The draft of the revised Audiovisual Media Services Directive does not mention privacy once – this should change, and the European legislator should follow Germany’s example and introduce, at the European level, a right to read anonymously and protection from unauthorised data sharing.

Then, there is the new legislative proposal for a Privacy and Electronic Communications Regulation that will replace today’s e-Privacy Directive. Elsewhere we explain how the Privacy and Electronic Communications Regulation could become a suitable vehicle to protect the confidentiality and security of users of interactive televisions and online content services.

The European legislator should use its exclusive competence in the area of data protection under Article 16 of the Treaty on the Functioning of the European Union to introduce provisions that protect the confidentiality and security of media consumption (and not only of information in transit) against unauthorised eavesdropping from within and outside the European Union.

It remains to conclude with the note that the Angels in Doctor Who are quantum-locked creatures: they cease to move when observed. If only the intelligence agencies were that easy to stop from hijacking our devices.

Source: The Weeping Angels are back, and they attack our privacy via smart TVs | Internet Policy Review

Breaking the filter bubble: democracy and design

It has been argued that the Internet and social media increase the number of viewpoints, perspectives, ideas and opinions available, leading to a very diverse pool of information. Critics, however, argue that the algorithms used by search engines, social networking platforms and other large online intermediaries actually decrease information diversity by forming so-called “filter bubbles”. This may pose a serious threat to our democracies. In response to this threat, others have developed algorithms and digital tools to combat filter bubbles. This paper first provides examples of different software designs that try to break filter bubbles. Secondly, we show how norms required by two democracy models dominate the tools developed to fight filter bubbles, while the norms of other models are missing entirely. The paper concludes that democracy is itself a contested concept that points to a variety of norms, and that designers of diversity-enhancing tools must therefore be exposed to diverse conceptions of democracy.

Source: Breaking the filter bubble: democracy and design
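
The paper surveys real tools rather than excerpting their code, so as a hedged illustration of one common design for such “bubble-breaking” software (not an algorithm taken from the paper), the sketch below re-ranks a feed by trading relevance against viewpoint redundancy, in the spirit of MMR-style diversification. The feed items, viewpoint labels and weight are all invented.

```python
# Illustrative diversity-aware re-ranker: greedily pick the item that
# balances relevance against how often its viewpoint has already been shown.

from collections import Counter

def diversify(items, k=5, weight=0.5):
    """items: list of (title, relevance, viewpoint). Return k re-ranked items."""
    selected = []
    seen = Counter()          # how often each viewpoint is already in the feed
    pool = list(items)
    while pool and len(selected) < k:
        def gain(item):
            _, relevance, viewpoint = item
            penalty = seen[viewpoint]          # repeated viewpoints cost more
            return (1 - weight) * relevance - weight * penalty
        best = max(pool, key=gain)
        pool.remove(best)
        selected.append(best)
        seen[best[2]] += 1
    return selected

feed = [
    ("Article A", 0.95, "left"), ("Article B", 0.90, "left"),
    ("Article C", 0.85, "left"), ("Article D", 0.70, "right"),
    ("Article E", 0.60, "centre"),
]
# A pure relevance ranking would show three "left" items; the re-ranker
# surfaces one item from each viewpoint instead.
for title, relevance, viewpoint in diversify(feed, k=3):
    print(title, viewpoint)
```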

Attack discrimination with smarter machine learning

If the goal is for the two groups to receive the same number of loans, then a natural criterion is demographic parity, where the bank uses loan thresholds that yield the same fraction of loans to each group. Or, as a computer scientist might put it, the “positive rate” is the same across both groups. In some contexts, this might be the right goal. In the situation in the diagram, though, there’s still something problematic: a demographic parity constraint only looks at loans given, not at the rates at which loans are repaid.

Source: Attack discrimination with smarter machine learning
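
The passage above turns on a single quantity, the per-group positive rate. As a rough sketch (not the article’s interactive demo; the decisions and group names below are invented), this is how the demographic parity check looks in code:

```python
# Toy demographic parity check: under this criterion, each group should
# receive loans at the same rate. All data below is invented.

def positive_rate(decisions):
    """Fraction of applicants in a group who were granted a loan."""
    return sum(decisions) / len(decisions)

# 1 = loan granted, 0 = denied, for two hypothetical groups
group_blue   = [1, 1, 0, 1, 0, 0, 1, 0]   # 4/8 granted
group_orange = [1, 0, 0, 1, 0, 0, 0, 0]   # 2/8 granted

for name, group in [("blue", group_blue), ("orange", group_orange)]:
    print(f"{name}: positive rate = {positive_rate(group):.2f}")

# Demographic parity would require these two rates to match. But, as the
# article notes, equal positive rates say nothing about who can repay,
# which is why parity alone can still shortchange qualified applicants.
```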