Category Archives: regulation

The Decentralized Web

The Web is a key space for civic debate and the current battleground for protecting freedom of expression. However, since its development, the Web has steadily evolved into an ecosystem of large, corporate-controlled mega-platforms which intermediate speech online. In many ways this has been a positive development; these platforms improved usability and enabled billions of people to publish and discover content without having to become experts on the Web’s intricate protocols.

But in other ways this development is alarming. Just a few large platforms drive most traffic to online news sources in the U.S., and thus have enormous influence over what sources of information the public consumes on a daily basis. The existence of these consolidated points of control is troubling for many reasons. A small number of stakeholders end up having outsized influence over the content the public can create and consume. This leads to problems ranging from censorship at the behest of national governments to more subtle, perhaps even unintentional, bias in the curation of content users see based on opaque, unaudited curation algorithms. The platforms that host our networked public sphere and inform us about the world are unelected, unaccountable, and often impossible to audit or oversee.

At the same time, there is growing excitement around the area of decentralized systems, which have grown in prominence over the past decade thanks to the popularity of the cryptocurrency Bitcoin. Bitcoin is a payment system that has no central points of control, and uses a novel peer-to-peer network protocol to agree on a distributed ledger of transactions, the blockchain. Bitcoin paints a picture of a world where untrusted networks of computers can coordinate to provide important infrastructure, like verifiable identity and distributed storage. Advocates of these decentralized systems propose related technology as the way forward to “re-decentralize” the Web, by shifting publishing and discovery out of the hands of a few corporations, and back into the hands of users. These types of code-based, structural interventions are appealing because in theory, they are less corruptible and resistant to corporate or political regulation. Surprisingly, low-level, decentralized systems don’t necessarily translate into decreased market consolidation around user-facing mega-platforms.

In this report, we explore two important ways structurally decentralized systems could help address the risks of mega-platform consolidation. First, these systems can help users publish and discover content directly, without intermediaries, and thus without censorship. All of the systems we evaluate advertise censorship-resistance as a major benefit. Second, these systems could indirectly enable greater competition and user choice, by lowering the barrier to entry for new platforms. As it stands, it is difficult for users to switch between platforms (they must recreate all their data when moving to a new service) and most mega-platforms do not interoperate, so switching means leaving behind your social network. Some systems we evaluate directly address the issues of data portability and interoperability in an effort to support greater competition.

We offer case studies of the following decentralized publishing projects:

  • Freedom Box, a system for personal publishing
  • Diaspora, a federated social network
  • Mastodon, a federated Twitter-like service
  • Blockstack, a distributed system for online identity services
  • IPFS (Interplanetary File System), a distributed storage service with a proposed mechanism to incentivize resource sharing
  • Solid (Social Linked Data), a linked-data protocol that could act as a back-end for data sharing between social media networks
  • Appcoins, a digital currency framework that enables users to financially participate in ownership of platforms and protocols
  • Steemit, an online community that uses an appcoin to incentivize development and community participation in a social network

Considering these projects as a whole, we found a robust and fertile community of experimenters developing promising software. Many of the projects in this report are working on deeply exciting new ideas. Easy-to-use, peer-to-peer distributed storage systems change the landscape for content censorship and archiving. Appcoins may transform how new projects are launched online, making it possible to fund open-source development teams focused on developing shared protocols instead of independent companies. There is also a renewed interest in creating interoperable standards and protocols that can cross platforms.

However, we have reason to doubt that these decentralized systems alone will address the problems of exclusion and bias caused by today’s mega-platforms.
For example, distributed, censorship-resistant storage does not help address problems related to bias in curation algorithms – content that doesn’t appear at the top of your feed might as well be invisible, even if it’s technically available.
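
To make the storage idea concrete: the censorship resistance claimed for distributed storage systems such as IPFS rests largely on content addressing, where an object’s identifier is derived from its own bytes rather than from the server that hosts it, so any node holding a copy can serve it and any recipient can verify it. The following is a minimal, illustrative Python sketch, not code from any of the projects listed above; the names content_address, publish and fetch and the in-memory stores are invented for this example.

```python
import hashlib

def content_address(data: bytes) -> str:
    """Return a content-derived identifier: a hash of the bytes themselves."""
    return hashlib.sha256(data).hexdigest()

# Toy "network": three independent in-memory stores keyed by content hash.
stores = [dict() for _ in range(3)]

def publish(data: bytes) -> str:
    """Store the data under its content address on every node
    (real systems replicate selectively)."""
    cid = content_address(data)
    for store in stores:
        store[cid] = data
    return cid

def fetch(cid: str) -> bytes:
    """Retrieve from any node that still holds a copy, verifying it against its address."""
    for store in stores:
        if cid in store:
            data = store[cid]
            assert content_address(data) == cid, "content does not match its address"
            return data
    raise KeyError(cid)

cid = publish(b"an article someone might want to suppress")
del stores[0][cid]  # "censoring" one host does not make the content unreachable
print(fetch(cid).decode(), cid[:16])
```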

Source: The Decentralized Web

The Weeping Angels are back, and they attack our privacy via smart TVs

One fundamental difference between traditional television sets and smart televisions is that the latter are connected to the internet. Like other connected devices, smart TVs make it possible to track what users do online and even offline, and therefore trigger privacy and data protection issues. We recently argued that the issue of media users’ privacy requires special attention from policymakers, not only from the perspective of data protection law, but also from media and communication law and policy. Tracking what people watch online can reveal sensitive insights into individual interests, political leanings, religious beliefs and cultural identity. In our article we challenge the lack of attention in media policy for the specific concerns about viewers’ privacy and data protection at the EU level. The latest revelations about the CIA’s attempts to turn smart TVs into eavesdropping devices (innocuously named Weeping Angel) just underscore how sensitive the issue of media users’ privacy really is, and how badly it needs protection.

Smart TV eavesdropping: a wake-up call

It is rather ironic that the secret services CIA and MI5 have chosen the name Weeping Angel, after the creatures from the British series Doctor Who, for their joint smart TV spy program. The latest media revelations – drawing on the WikiLeaks release Vault 7: CIA Hacking Tools Revealed (published on 7 March 2017) – allege that US security services are capable of using smart TVs to eavesdrop on users, and they are hardly less creepy than the Angels’ attacks in the series. Of all devices, the CIA has been targeting the TV set, our trusted friend in the living room, to capture audio (even when the set appears to be switched off), extract the Wi-Fi network credentials the TV uses, and harvest other usernames and passwords stored in the TV browser. This incident is yet another wake-up call to European policymakers to better protect the security of connected consumer devices and the privacy and right to confidentiality of media users.

The connective capabilities of smart TVs have led to public outcries in several European countries over invasions of users’ privacy in a variety of ways. In 2013, the media reported that a smart TV was found to transfer information about what users were viewing to the equipment manufacturer. In 2015, the voice control of a smart TV made headlines for inadvertently eavesdropping on private conversations. We reviewed a number of implementation and enforcement actions in Germany and the Netherlands. In our analysis we show how users’ agency is significantly reduced because information duties were not complied with and default settings were not privacy-preserving. Overzealous data collection via smart TVs is not just a European issue: the US Federal Trade Commission just fined a smart TV provider for recording the viewing habits of 11 million customers and selling the data to third parties.

Beyond privacy: freedom of expression at stake

One of the particularities of the discussion on smart TV privacy is that it is being dealt with almost exclusively as an issue of data protection and privacy, and that the debate is completely oblivious to the broader and at least equally worrying implications for freedom of expression and media law and policy. In its 2013 Green Paper on Convergence, for example, the European Commission does acknowledge that “the processing of personal data is often the prerequisite for the functioning of new services, even though the individual is often not fully aware of the collection and processing of personal data”. However, the document makes it very clear that the European Commission believes these matters should, in the first place, be a matter for EU data protection regulation. We argue, conversely, that the issue of users’ viewing privacy is also a matter for media and communication law and policy, at the level of both the EU and its member states. This is because of the potential impact that tracking users online can have on their freedom to inform themselves and to exercise their right to freedom of expression. Privacy in this context is instrumental in furthering our freedom of expression rights, which is why the privacy of media users deserves special attention. The TV set is not only the device that sees us lounging in our pyjamas on the couch (in itself reason enough to be worried about privacy). More importantly, we use TV services to inform ourselves and to prepare for our role as informed citizens. As scholars have convincingly argued, a certain level of intellectual privacy or breathing space is indispensable for our ability to form our ideas and opinions – unmonitored by device producers, advertisers and the CIA.

The Council of Europe noted explicitly in the context of tracking users online that “[t]hese capabilities and practices can have a chilling effect on citizen participation in social, cultural and political life and, in the longer term, could have damaging effects on democracy. … More generally, they can endanger the exercise of freedom of expression and the right to receive and impart information protected under Article 10 of the European Convention on Human Rights”.

German data protection authorities observe that “[t]elevision is a central medium for conveying information and an essential condition for freedom of expression. The right to unobstructed information access is constitutionally protected and a basic prerequisite for the democratic order. The comprehensive collection, processing and using of information about user behaviour will adversely affect the exercise of that right”. The Dutch data protection authority underscores that personal data collected via smart TVs is sensitive because it can reveal very individual patterns which could potentially disclose a person’s specific social background, financial or family situation. These initiatives confirm that users of media services may require a higher, or at least a different, level of protection when consuming media content than, for example, when buying shoes online. A talk by Alexander Nix of Cambridge Analytica (listen in at 9’ 22” of the video) shows just how much companies want to tap into data about people’s viewing behaviour.

So far, only Germany has specific privacy safeguards in its media law (Sections 13 and 15 of the German Telemedia Act). Remarkably, in Germany viewers have the right to use their TV anonymously, insofar as this is technically feasible. German law moreover prohibits the sharing of personal data on the use of television and on-demand audiovisual media services with third parties. Only anonymised usage data can be shared with third parties for market research and analytical purposes. This ties in with literature that argues in favour of protecting privacy in order to preserve the freedom to receive information and hold opinions.

Get it done with the new e-Privacy Regulation

Thankfully, there are several opportunities for the European legislator to show that it has heard the wake-up call and is moving into action to protect media users. The draft for a revised Audiovisual Media Services Directive does not mention privacy once – this should change, and the European legislator should follow the example of Germany and introduce a right to read anonymously and protection from unauthorised data sharing at the European level.

Then, there is the new legislative proposal for a Privacy and Electronic Communications Regulation that will replace today’s e-Privacy Directive. Elsewhere we explain how the Privacy and Electronic Communications Regulation could become a suitable vehicle to protect the confidentiality and security of users of interactive televisions and online content services.

The European legislator should use its exclusive competence under Article 16 of the Treaty on the Functioning of the European Union in the area of data protection to introduce provisions that protect the confidentiality and security of media consumption (and not only of information in transit) against unauthorised eavesdropping from within and outside the European Union.

It remains to conclude with the note that the Angels in Doctor Who are quantum-locked creatures: they cease to move when observed – if only the intelligence agencies were that easy to stop from hijacking our devices.

Source: The Weeping Angels are back, and they attack our privacy via smart TVs | Internet Policy Review

Algorithms we want | n.n. — notes & nodes on society, technology and the space of the possible, by felix stalder

If we now demand that algorithms have to be made better in this applied sense, we are only demanding that the program built into them should work better. But this program is not just harmless efficiency: the definition of the problems and of the possible solutions almost always corresponds to a neo-liberal world view. By this I mean three things. First, society is individualized. Everything is attributed to individual, specifiable persons, separated from one another by their differences, and action means first and foremost individual action. Second, the individuals so identified are placed in a competing relationship with one another, through all sorts of rankings on which one’s own position relative to that of others can rise or fall. And third, the group or the collective – which expresses the consciousness of its members as related to one another – is replaced by the aggregate, which is essentially formed without the knowledge of the actors, either because it is supposed to emerge spontaneously, as Friedrich von Hayek thought, or because it is constructed behind the backs of the people in the closed depths of the data centers, visible only to a few actors.

Source: Algorithms we want | n.n. — notes & nodes on society, technology and the space of the possible, by felix stalder

Facebook Must Acknowledge and Change Its Financial Incentives – NYTimes.com

Facebook must create institutionalized pathways for journalists and policymakers to help shape any further changes to the algorithm. First steps could include more transparency about the business model driving these changes, incorporating opportunities for comment from members of civil society and the news industry, and creating an internal team dedicated to media ethics concerns, with an explicit mission statement driven by values rather than increasing clicks and views.

Source: Facebook Must Acknowledge and Change Its Financial Incentives – NYTimes.com

How to Hold Algorithms Accountable

Algorithms are now used throughout the public and private sectors, informing decisions on everything from education and employment to criminal justice. But despite the potential for efficiency gains, algorithms fed by big data can also amplify structural discrimination, produce errors that deny services to individuals, or even seduce an electorate into a false sense of security. Indeed, there is growing awareness that the public should be wary of the societal risks posed by over-reliance on these systems and …

Source: How to Hold Algorithms Accountable

Algorithmic selection on the Internet – Media Change & Innovation – IPMZ – University of Zurich

Publications

  • Just, Natascha / Latzer, Michael (2016): Governance by Algorithms: Reality Construction by Algorithmic Selection on the Internet. In: Media, Culture & Society [accepted manuscript; forthcoming online before print]. [pdf]
  • Dörr, Konstantin / Hollnbuchner, Katharina (2016): Ethical Challenges of Algorithmic Journalism. In: Digital Journalism [accepted manuscript; forthcoming online before print]. [pdf]
  • Latzer, Michael / Hollnbuchner, Katharina / Just, Natascha / Saurwein, Florian (2016): The economics of algorithmic selection on the Internet. In: Bauer, J. and Latzer, M. (Eds), Handbook on the Economics of the Internet. Cheltenham, Northampton: Edward Elgar, 395-425. [pdf]
  • Saurwein, Florian / Just, Natascha / Latzer, Michael (2015): Governance of algorithms: options and limitations. In: info, Vol. 17 (6), 35-49. [pdf]
  • Dörr, Konstantin (2015): Mapping the field of Algorithmic Journalism. In: Digital Journalism [online before print]. [pdf]

Society in the Loop Artificial Intelligence – Joi Ito’s Web

Iyad Rahwan was the first person I heard use the term society-in-the-loop machine learning. He was describing his work, which had just been published in Science, on polling the public through an online test to find out how they felt about various decisions people would want a self-driving car to make – a modern version of what philosophers call “The Trolley Problem.” The idea was that by understanding the priorities and values of the public, we could train machines to behave in ways that society would consider ethical. We might also build a system that allows people to interact with the Artificial Intelligence (AI) and test its ethics by asking it questions or watching how it behaves.

Source: Society in the Loop Artificial Intelligence – Joi Ito’s Web

What publishers should know about the new EU data laws

Dominic Perkins, commercial development director at Time Inc., asked whether it makes more sense for publishers to create data co-ops, creating a single identifier in order to be more transparent and sharing it among themselves.

Probably, answered Yves Schwarzbart, acting head of policy and regulatory affairs at the IAB U.K. “Simplifying the data processes is in everyone’s interests,” he said. “It needs to start with someone.”

Source: The European Commission has just passed a new EU wide data law, great for personal data protection, not so good for publishers