Gobo retrieves posts from people you follow on Twitter and Facebook and analyzes them using simple machine learning-based filters. You can set those filters — seriousness, rudeness, virality, gender and brands — to eliminate some posts from your feed. The “politics” slider works differently, “filtering in”, instead of “filtering out” — if you set the slider towards “lots of perspectives”, our “news echo” algorithm will start adding in posts from media outlets that you likely don’t read every day.
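The distinction between "filtering out" and "filtering in" can be made concrete with a small sketch. This is purely illustrative: Gobo's real filters are machine-learning models, and the field names, thresholds, and post structure below are invented for demonstration.

```python
# Illustrative sketch of the two filter styles Gobo describes.
# Post attributes and thresholds here are made up, not Gobo's actual schema.

def filter_out(posts, attribute, threshold):
    """Most sliders remove posts whose score exceeds a user-set threshold."""
    return [p for p in posts if p[attribute] <= threshold]

def filter_in(posts, outside_posts, perspectives):
    """The 'politics' slider works in reverse: a higher setting (0 to 1)
    mixes in more posts from outlets the user does not normally read."""
    n_extra = int(len(outside_posts) * perspectives)
    return posts + outside_posts[:n_extra]

feed = [
    {"text": "interesting article", "rudeness": 0.2},
    {"text": "hostile rant", "rudeness": 0.9},
]
outside = [{"text": "op-ed from an unfamiliar outlet", "rudeness": 0.1}]

polite = filter_out(feed, "rudeness", threshold=0.5)   # drops the rant
broadened = filter_in(polite, outside, perspectives=1.0)  # adds the op-ed
```

The asymmetry is the point: one mechanism narrows the feed, the other deliberately widens it.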
The Web is a key space for civic debate and the current battleground for protecting freedom of expression. However, since its development, the Web has steadily evolved into an ecosystem of large, corporate-controlled mega-platforms which intermediate speech online. In many ways this has been a positive development; these platforms improved usability and enabled billions of people to publish and discover content without having to become experts on the Web's intricate protocols. But in other ways this development is alarming. Just a few large platforms drive most traffic to online news sources in the U.S., and thus have enormous influence over what sources of information the public consumes on a daily basis. The existence of these consolidated points of control is troubling for many reasons. A small number of stakeholders end up having outsized influence over the content the public can create and consume. This leads to problems ranging from censorship at the behest of national governments to more subtle, perhaps even unintentional, bias in the curation of content users see, based on opaque, unaudited curation algorithms. The platforms that host our networked public sphere and inform us about the world are unelected, unaccountable, and often impossible to audit or oversee.
At the same time, there is growing excitement around the area of decentralized systems, which have grown in prominence over the past decade thanks to the popularity of the cryptocurrency Bitcoin. Bitcoin is a payment system that has no central points of control, and uses a novel peer-to-peer network protocol to agree on a distributed ledger of transactions, the blockchain. Bitcoin paints a picture of a world where untrusted networks of computers can coordinate to provide important infrastructure, like verifiable identity and distributed storage. Advocates of these decentralized systems propose related technology as the way forward to "re-decentralize" the Web, by shifting publishing and discovery out of the hands of a few corporations, and back into the hands of users. These types of code-based, structural interventions are appealing because, in theory, they are less corruptible and more resistant to corporate or political control. Surprisingly, however, low-level decentralized systems don't necessarily translate into decreased market consolidation around user-facing mega-platforms.

In this report, we explore two important ways structurally decentralized systems could help address the risks of mega-platform consolidation. First, these systems can help users publish and discover content directly, without intermediaries, and thus without censorship. All of the systems we evaluate advertise censorship-resistance as a major benefit. Second, these systems could indirectly enable greater competition and user choice, by lowering the barrier to entry for new platforms. As it stands, it is difficult for users to switch between platforms (they must recreate all their data when moving to a new service), and most mega-platforms do not interoperate, so switching means leaving behind your social network.
Some systems we evaluate directly address the issues of data portability and interoperability in an effort to support greater competition.

We offer case studies of the following decentralized publishing projects:

- Freedom Box, a system for personal publishing
- Diaspora, a federated social network
- Mastodon, a federated Twitter-like service
- Blockstack, a distributed system for online identity services
- IPFS (Interplanetary File System), a distributed storage service with a proposed mechanism to incentivize resource sharing
- Solid (Social Linked Data), a linked-data protocol that could act as a back-end for data sharing between social media networks
- Appcoins, a digital currency framework that enables users to financially participate in ownership of platforms and protocols
- Steemit, an online community that uses an appcoin to incentivize development and community participation in a social network

Considering these projects as a whole, we found a robust and fertile community of experimenters developing promising software. Many of the projects in this report are working on deeply exciting new ideas. Easy-to-use, peer-to-peer distributed storage systems change the landscape for content censorship and archiving. Appcoins may transform how new projects are launched online, making it possible to fund open-source development teams focused on developing shared protocols instead of independent companies. There is also renewed interest in creating interoperable standards and protocols that can cross platforms.

However, we have reason to doubt that these decentralized systems alone will address the problems of exclusion and bias caused by today's mega-platforms. For example, distributed, censorship-resistant storage does not help address problems related to bias in curation algorithms – content that doesn't appear at the top of your feed might as well be invisible, even if it's technically available.
Source: The Decentralized Web
How significant is algorithmic personalization in searches for political parties and candidates?
Shoppers with Internet access and a bargain-hunting impulse can find a universe of products at their fingertips. In this thought-provoking exposé, Ariel Ezrachi and Maurice Stucke invite us to take a harder look at today's app-assisted paradise of digital shopping. While consumers reap many benefits from online purchasing, the sophisticated algorithms and data-crunching that make browsing so convenient are also changing the nature of market competition, and not always for the better.
Computers colluding is one danger. Although long-standing laws prevent companies from fixing prices, data-driven algorithms can now quickly monitor competitors’ prices and adjust their own prices accordingly. So what is seemingly beneficial—increased price transparency—ironically can end up harming consumers. A second danger is behavioral discrimination. Here, companies track and profile consumers to get them to buy goods at the highest price they are willing to pay. The rise of super-platforms and their “frenemy” relationship with independent app developers raises a third danger. By controlling key platforms (such as the operating system of smartphones), data-driven monopolies dictate the flow of personal data and determine who gets to exploit potential buyers.
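The first danger, tacit algorithmic collusion, is easy to illustrate with a toy model. In this hypothetical sketch, each seller runs a repricing bot that simply matches the highest rival price it observes (capped at some ceiling), so prices drift to the ceiling without any explicit agreement between the firms. All names and numbers are invented for illustration.

```python
# Toy model of tacit algorithmic collusion: repricing bots that match the
# highest observed rival price converge on the ceiling with no agreement.

def reprice(own_price, rival_prices, ceiling):
    """Match the highest price seen in the market, capped at a ceiling."""
    return min(max(rival_prices + [own_price]), ceiling)

prices = {"seller_a": 9.00, "seller_b": 9.50, "seller_c": 10.00}

for _ in range(5):  # each round, every bot monitors and reprices
    for seller in prices:
        rivals = [p for s, p in prices.items() if s != seller]
        prices[seller] = reprice(prices[seller], rivals, ceiling=10.00)

# After a few rounds, every seller sits at the ceiling price.
```

No bot ever "agreed" with another; the upward drift emerges purely from each bot's transparent view of competitors' prices, which is the irony the authors point to.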
Virtual Competition raises timely questions. To what extent does the “invisible hand” still hold sway? In markets continually manipulated by bots and algorithms, is competitive pricing an illusion? Can our current laws protect consumers? The changing market reality is already shifting power into the hands of the few. Ezrachi and Stucke explore the resulting risks to competition, our democratic ideals, and our economic and overall well-being.
What is PolitEcho?

PolitEcho shows you the political biases of your Facebook friends and news feed. The app assigns each of your friends a score based on our prediction of their political leanings, then displays a graph of your friend list. Then it calculates the political bias in the content of your news feed and compares it with the bias of your friends list, to highlight possible differences between the two.
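The comparison step PolitEcho describes can be sketched in a few lines. This is a toy model, not PolitEcho's implementation: it assumes each friend and each feed item already carries a leaning score in [-1, 1] (left to right), and the scores below are made up; producing those scores is the hard part the app's prediction model handles.

```python
# Toy version of PolitEcho's comparison: average the leaning scores of the
# whole friend list, average the scores of what the feed surfaced, and look
# at the gap between the two. Scores are invented for illustration.

def mean(xs):
    return sum(xs) / len(xs)

friend_scores = [-0.8, -0.2, 0.1, 0.4, 0.6]  # entire friend list
feed_scores = [-0.7, -0.6, -0.5, -0.3]       # items the feed actually showed

friends_bias = mean(friend_scores)   # roughly centrist friend list
feed_bias = mean(feed_scores)        # feed skews left of the friend list
gap = feed_bias - friends_bias       # nonzero gap = feed misrepresents friends
```

A large gap between the two averages is the signal of interest: the feed's curation is showing you a politically skewed sample of people you actually follow.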
- Just, Natascha / Latzer, Michael (2016): Governance by Algorithms: Reality Construction by Algorithmic Selection on the Internet. In: Media, Culture & Society [accepted manuscript, forthcoming online before print]. [pdf]
- Dörr, Konstantin / Hollnbuchner, Katharina (2016): Ethical Challenges of Algorithmic Journalism. In: Digital Journalism [accepted manuscript; forthcoming online before print]. [pdf]
- Latzer, Michael / Hollnbuchner, Katharina / Just, Natascha / Saurwein, Florian (2016): The economics of algorithmic selection on the Internet. In: Bauer, J. and Latzer, M. (Eds), Handbook on the Economics of the Internet. Cheltenham, Northampton: Edward Elgar, 395-425. [pdf]
- Saurwein, Florian / Just, Natascha / Latzer, Michael (2015): Governance of algorithms: options and limitations. In: info, Vol. 17 (6), 35-49. [pdf]
- Dörr, Konstantin (2015): Mapping the field of Algorithmic Journalism. In: Digital Journalism [online before print]. [pdf]
What is algorithmic fairness and why is it important?

This site serves to collect articles and research that will help to answer these questions. Our own take on the research questions behind these issues can be found in this paper. More research is collected at fatml.org.
A lot of research has been carried out around using data analysis to identify different aspects of online behavior. Before I detail some of it below, I should add a note of caution: all analytics are only as good as how they are utilized in decision making by end users. The complex interplay between computational tools and human actors in sociotechnical systems (such as online communities) means that great technology and analytics can still fall flat if the community policies aren't "right".
Beyond IRBs: Ethical Guidelines for Data Research
by Omer Tene and Jules Polonetsky. This article focuses specifically on issues related to data-driven research, an area where the notion of harm is still hotly debated and both benefit and risk are typically intangible.
Source: Papers | Big Data Ethics