Category Archives: regulation

The Weeping Angels are back, and they attack our privacy via smart TVs

One fundamental difference between traditional television sets and smart televisions is that the latter are connected to the internet. Like other connected devices, smart TVs make it possible to track what users do online and even offline, and therefore trigger privacy and data protection issues. We recently argued that the issue of media users’ privacy requires special attention from policymakers, not only from the perspective of data protection law, but also from media and communication law and policy. Tracking what people watch online can reveal sensitive insights into individual interests, political leanings, religious beliefs and cultural identity. In our article we challenge the lack of attention in media policy for the specific concerns about viewers’ privacy and data protection at the EU level. The latest revelations about the CIA’s attempts to turn smart TVs into eavesdropping devices (innocuously named Weeping Angel) just underscore how sensitive the issue of media users’ privacy really is, and how badly it needs protection.

Smart TV eavesdropping: a wake-up call

It is rather ironic that the secret services CIA and MI5 chose the name Weeping Angels, after the creatures from the British series Doctor Who, for their joint smart TV spy program. The latest media revelations – drawing on the WikiLeaks release Vault 7: CIA Hacking Tools Revealed (7 March 2017) – that US security services are capable of using smart TVs to eavesdrop on users are no less creepy than the attacks of the Angels themselves. Of all devices, the CIA has been targeting the TV set, our trusted friend in the living room, to capture audio (even when the set appears to be switched off), extract the credentials of the Wi-Fi network the TV uses, and harvest other usernames and passwords stored in the TV browser. This incident is yet another wake-up call to European policymakers to better protect the security of connected consumer devices and the privacy and right to confidentiality of media users.

The connective capabilities of smart TVs have led to public outcry in several European countries over a variety of privacy intrusions. In 2013, the media reported that a smart TV was found to transfer information about what users were viewing to the equipment manufacturer. In 2015, the voice control of a smart TV made headlines for inadvertently eavesdropping on private conversations. We reviewed a number of implementation and enforcement actions in Germany and the Netherlands. In our analysis we show how users’ agency is significantly reduced because information duties have not been complied with and because default settings were not privacy-preserving. Overzealous data collection via smart TVs is not just a European issue: the US Federal Trade Commission recently fined a smart TV provider for recording the viewing habits of 11 million customers and selling them to third parties.

Beyond privacy: freedom of expression at stake

One of the particularities of the discussion on smart TV privacy is that it is treated almost exclusively as an issue of data protection and privacy, while the debate remains oblivious to the broader and at least equally worrying implications for freedom of expression and media law and policy. In its 2013 Green Paper on Convergence, for example, the European Commission does acknowledge that “the processing of personal data is often the prerequisite for the functioning of new services, even though the individual is often not fully aware of the collection and processing of personal data”. However, the document makes it very clear that the European Commission believes that these matters should, in the first place, be a matter for EU data protection regulation. We argue, conversely, that the issue of users’ viewing privacy is also a matter for media and communication law and policy, at the level of both the EU and its member states. This is because of the potential impact that tracking users online can have on their freedom to inform themselves and to exercise their right to freedom of expression. Privacy in this context is instrumental in furthering our freedom of expression, which is why media users’ privacy deserves special attention. The TV set is not only the device that sees us lounging in our pyjamas on the couch (in itself reason enough to worry about privacy). More importantly, we use TV services to inform ourselves and to prepare for our role as informed citizens. As scholars have convincingly argued, a certain level of intellectual privacy or breathing space is indispensable for our ability to form ideas and opinions – unmonitored by device producers, advertisers and the CIA.

The Council of Europe noted explicitly in the context of tracking users online that “[t]hese capabilities and practices can have a chilling effect on citizen participation in social, cultural and political life and, in the longer term, could have damaging effects on democracy. … More generally, they can endanger the exercise of freedom of expression and the right to receive and impart information protected under Article 10 of the European Convention on Human Rights”.

German data protection authorities observe that “[t]elevision is a central medium for conveying information and an essential condition for freedom of expression. The right to unobstructed information access is constitutionally protected and a basic prerequisite for the democratic order. The comprehensive collection, processing and using of information about user behaviour will adversely affect the exercise of that right”. The Dutch data protection authority underscores that personal data collected via smart TVs is sensitive because it can reveal highly individual patterns which could disclose a person’s social background or financial or family situation. These initiatives confirm that users of media services may require a higher, or at least different, level of protection when consuming media content than, for example, when buying shoes online. A talk by Alexander Nix of Cambridge Analytica (listen in at 9’ 22” of the video) shows just how much companies want to tap into data about people’s viewing behaviour.

So far, only Germany has specific privacy safeguards in its media law (Sections 13 and 15 of the Act on Telemedia). Remarkably, in Germany viewers have the right to use their TV anonymously, insofar as this is technically feasible. German law moreover prohibits the sharing of personal data on the use of television and on-demand audiovisual media services with third parties; only anonymised usage data may be shared with third parties for market research and analytical purposes. This ties in with literature that argues in favour of protecting privacy in order to preserve the freedom to receive information and hold opinions.

Get it done with the new e-Privacy Regulation

Thankfully, there are several opportunities for the European legislator to show that it has heard the wake-up call and is moving into action to protect media users. The draft for a revised Audiovisual Media Services Directive does not mention privacy once – this should change, and the European legislator should follow Germany’s example and introduce a right to read anonymously and protection from unauthorised data sharing at the European level.

Then, there is the new legislative proposal for a Privacy and Electronic Communications Regulation that will replace today’s e-Privacy Directive. Elsewhere we explain how the Privacy and Electronic Communications Regulation could become a suitable vehicle to protect the confidentiality and security of users of interactive televisions and online content services.

The European legislator should use its exclusive competence in the area of data protection under Article 16 of the Treaty on the Functioning of the European Union to introduce provisions that protect the confidentiality and security of media consumption (and not only of information in transit) against unauthorised eavesdropping from within and outside the European Union.

It remains to conclude with the note that the Angels in Doctor Who are quantum-locked creatures: they cease to move when observed. If only the intelligence agencies were that easy to stop from hijacking our devices.

Source: The Weeping Angels are back, and they attack our privacy via smart TVs | Internet Policy Review

Algorithms we want | n.n. — notes & nodes on society, technology and the space of the possible, by felix stalder

If we now demand that algorithms be made better in this applied sense, we only demand that the program built into them should work better. But this program is not just harmless efficiency; the definition of the problems and of the possible solutions almost always corresponds to a neo-liberal world view. By this I mean three things. First, society is individualized: everything is attributed to individual, identifiable persons, separated from one another by their differences, and action means first and foremost individual action. Second, the individuals so identified are placed in competition with one another, through all sorts of rankings on which one’s own position relative to that of others can rise or fall. And third, the group or the collective – which expresses the consciousness of its members as related to one another – is replaced by the aggregate, which is essentially formed without the knowledge of the actors: either because it is supposed to emerge spontaneously, as Friedrich von Hayek imagined, or because it is constructed behind people’s backs in the closed depths of the data centers, visible only to a few actors.

Source: Algorithms we want | n.n. — notes & nodes on society, technology and the space of the possible, by felix stalder

Facebook Must Acknowledge and Change Its Financial Incentives – NYTimes.com

Facebook must create institutionalized pathways for journalists and policymakers to help shape any further changes to the algorithm. First steps could include more transparency about the business model driving these changes, incorporating opportunities for comment from members of civil society and the news industry, and creating an internal team dedicated to media ethics concerns, with an explicit mission statement driven by values rather than increasing clicks and views.

Source: Facebook Must Acknowledge and Change Its Financial Incentives – NYTimes.com

How to Hold Algorithms Accountable

Algorithms are now used throughout the public and private sectors, informing decisions on everything from education and employment to criminal justice. But despite the potential for efficiency gains, algorithms fed by big data can also amplify structural discrimination, produce errors that deny services to individuals, or even seduce an electorate into a false sense of security. Indeed, there is growing awareness that the public should be wary of the societal risks posed by over-reliance on these systems…

Source: How to Hold Algorithms Accountable

Algorithmic selection on the Internet – Media Change & Innovation – IPMZ – University of Zurich

Publications

  • Just, Natascha / Latzer, Michael (2016): Governance by Algorithms: Reality Construction by Algorithmic Selection on the Internet. In: Media, Culture & Society [accepted manuscript, forthcoming online before print]. [pdf]
  • Dörr, Konstantin / Hollnbuchner, Katharina (2016): Ethical Challenges of Algorithmic Journalism. In: Digital Journalism [accepted manuscript; forthcoming online before print]. [pdf]
  • Latzer, Michael / Hollnbuchner, Katharina / Just, Natascha / Saurwein, Florian (2016): The economics of algorithmic selection on the Internet. In: Bauer, J. and Latzer, M. (Eds), Handbook on the Economics of the Internet. Cheltenham, Northampton: Edward Elgar, 395-425. [pdf]
  • Saurwein, Florian / Just, Natascha / Latzer, Michael (2015): Governance of algorithms: options and limitations. In: info, Vol. 17 (6), 35-49. [pdf]
  • Dörr, Konstantin (2015): Mapping the field of Algorithmic Journalism. In: Digital Journalism [online before print]. [pdf]


Society in the Loop Artificial Intelligence – Joi Ito’s Web

Iyad Rahwan was the first person I heard use the term society-in-the-loop machine learning. He was describing his work, just published in Science, on polling the public through an online test to find out how they felt about various decisions people would want a self-driving car to make – a modern version of what philosophers call “the trolley problem”. The idea was that by understanding the priorities and values of the public, we could train machines to behave in ways that society would consider ethical. We might also build a system that allows people to interact with the Artificial Intelligence (AI) and test its ethics by asking questions or watching it behave.
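The polling idea described above – collect many people’s verdicts on moral dilemmas, then distil them into guidance for a machine – can be caricatured as simple preference aggregation. A minimal sketch (the scenario names and poll data below are hypothetical illustrations, not from Rahwan’s actual study):

```python
from collections import Counter

def aggregate_preferences(responses):
    """Majority-vote aggregation of crowd answers, one verdict per scenario.

    responses: dict mapping a scenario id to the list of actions chosen
    by poll respondents (e.g. "swerve" vs "stay"). Returns the action
    picked most often for each scenario.
    """
    return {
        scenario: Counter(choices).most_common(1)[0][0]
        for scenario, choices in responses.items()
    }

# Hypothetical poll data for two trolley-style dilemmas:
poll = {
    "pedestrian_vs_passenger": ["swerve", "swerve", "stay", "swerve"],
    "one_vs_five": ["stay", "stay", "swerve"],
}
print(aggregate_preferences(poll))
# {'pedestrian_vs_passenger': 'swerve', 'one_vs_five': 'stay'}
```

Real society-in-the-loop proposals are of course far richer than a majority vote – they weigh demographics, disagreement and minority views – but the sketch shows the basic loop: elicit public judgments, aggregate them, and feed the result into the machine’s policy.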

Source: Society in the Loop Artificial Intelligence – Joi Ito’s Web

What publishers should know about the new EU data laws

Dominic Perkins, commercial development director at Time Inc., asked whether it makes more sense for publishers to create data co-ops, establishing a single identifier in order to be more transparent and sharing it among themselves.

Probably, answered Yves Schwarzbart, acting head of policy and regulatory affairs at the IAB U.K. “Simplifying the data processes is in everyone’s interests,” he said. “It needs to start with someone.”

Source: The European Commission has just passed a new EU wide data law, great for personal data protection, not so good for publishers

How algorithms rule the world | Science | The Guardian

On 4 August 2005, the police department of Memphis, Tennessee, made so many arrests over a three-hour period that it ran out of vehicles to transport the detainees to jail. Three days later, 1,200 people had been arrested across the city – a new police department record. Operation Blue Crush was hailed as a huge success. Larry Godwin, the city’s new police director, quickly rolled out the scheme, and by 2011 crime across the city had fallen by 24%. When it was revealed Blue Crush faced budget cuts earlier…

Source: How algorithms rule the world | Science | The Guardian