The Guardian reveals Facebook’s manuals for moderators
Sue Halpern, June 8, 2017
Prototype Politics: Technology-Intensive Campaigning and the Data of Democracy by Daniel Kreiss. Oxford University Press, 291 pp., $99.00; $27.95 (paper)
Hacking the Electorate: How Campaigns Perceive Voters by Eitan D. Hersh. Cambridge University Press, 261 pp., $80.00; $30.99 (paper)
Donald Trump; drawing by James Ferguson
Not long after Donald Trump’s surprising presidential victory, an article published in the Swiss weekly Das Magazin, and reprinted online in English by Vice, began churning through the Internet. While pundits were dissecting the collapse of Hillary Clinton’s campaign, the journalists for Das Magazin, Hannes Grassegger and Mikael Krogerus, pointed to an entirely different explanation: the work of Cambridge Analytica, a data science firm created by a British company with deep ties to the British and American defense industries.

According to Grassegger and Krogerus, Cambridge Analytica had used psychological data culled from Facebook, paired with vast amounts of consumer information purchased from data-mining companies, to develop algorithms that were supposedly able to identify the psychological makeup of every voter in the American electorate. The company then developed political messages tailored to appeal to the emotions of each one. As the New York Times reporters Nicholas Confessore and Danny Hakim described it:

“A voter deemed neurotic might be shown a gun-rights commercial featuring burglars breaking into a home, rather than a defense of the Second Amendment; political ads warning of the dangers posed by the Islamic State could be targeted directly at voters prone to anxiety….”

Even more troubling was the underhanded way in which Cambridge Analytica appeared to have obtained its information. Using an Amazon site called Mechanical Turk, the company paid one hundred thousand people in the United States a dollar or two each to fill out an online survey. But in order to receive payment, those people were also required to download an app that gave Cambridge Analytica access to the profiles of their unwitting Facebook friends.
By Elizabeth Denham, Information Commissioner.
In March we announced we were conducting an assessment of the data protection risks arising from the use of data analytics, including for political purposes.
Engagement with the electorate is vital to the democratic process. Given the big data revolution it is understandable that political campaigns are exploring the potential of advanced data analysis tools to help win votes. The public have the right to expect that this takes place in accordance with the law as it relates to data protection and electronic marketing.
What is the role of search in shaping opinion? Survey results indicate, among other findings:

1. The filter-bubble argument is overstated: Internet users expose themselves to a variety of opinions and viewpoints online and through a diversity of media. Search needs to be viewed in a context of multiple media.
2. Concerns over echo chambers are also overstated: Internet users are exposed to diverse viewpoints both online and offline. Most users are not silenced by contrasting views, nor do they silence those who disagree with them.
3. Fake news has attracted disproportionate levels of concern in light of people’s actual practices. Internet users are generally skeptical of information across all media and know how to check the accuracy and validity of information found through search, on social media, or on the Internet in general.
Source: Search and Politics: The Uses and Impacts of Search in Britain, France, Germany, Italy, Poland, Spain, and the United States by William H. Dutton, Bianca Christin Reisdorf, Elizabeth Dubois, and Grant Blank (SSRN)
When Instagram announced the implementation of algorithmic personalization on its platform, a heated debate arose, and several users instantly expressed their strong discontent under the hashtag #RIPINSTAGRAM. In this paper, we examine how users commented on Instagram’s announcement that it would implement algorithmic personalization. Framing user comments as “counter-narratives” (Andrews, 2004) that oppose Instagram’s organizational narrative of improving the user experience, the study explores in greater detail the main concerns users bring forth. The two-step analysis draws on a total of 8,645 comments collected from Twitter and Instagram. The Twitter data were used to develop preliminary inductive categories describing users’ counter-narratives; we then systematically coded all of the Instagram data in order to enhance, adjust, and revise those preliminary categories. This inductive coding approach (Mayring, 2000), combined with an in-depth qualitative analysis, resulted in the identification of four counter-narratives brought forth by users: 1) algorithmic hegemony; 2) violation of user autonomy; 3) prevalence of commercial interests; and 4) deification of the mainstream. All of these counter-narratives relate to ongoing public debates about the social implications of algorithmic personalization. In conclusion, the paper suggests that the identified counter-narratives tell a story of resistance: while technological advancement is generally welcomed and celebrated, the findings of this study point toward growing user resistance to algorithmic personalization.
I believe the social media giant could target ads at depressed teens and countless other demographics. But so what?
“The tech industry is in the middle of a massive, uncontrolled social experiment. Having made commercial mass surveillance the economic foundation of our industry, we are now learning how indiscriminate collections of personal data, and the machine learning algorithms they fuel, can be put to effective political use. Unfortunately, these experiments are being run in production. Our centralized technologies could help authoritarians more than they help democracy, and the very power of the tools we’ve built for persuasion makes it difficult for us to undo the damage done. What can concerned people in the tech industry do to seize a dwindling window of opportunity, and create a less monstrous online world?”
Leaked 2017 document reveals FB Australia’s intent to exploit teens’ words, images.
Edgar Meij, a senior data scientist on Bloomberg’s news search experience team, is working to explain the logic behind related entities in search results.
One of the first steps toward that goal is to design an algorithm that can explain the relationship between two terms – called entities – in plain English. In a paper written together with two researchers from the University of Amsterdam, Prof. Dr. Maarten de Rijke and Nikos Voskarides, he presents a methodology for doing just that.
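To give a sense of the task, here is a deliberately simplified sketch, not the authors’ method (their paper learns to select and adapt explanatory sentences from a corpus). It assumes a toy knowledge graph of hypothetical (subject, relation, object) triples and fills relation-specific templates to produce a plain-English sentence for an entity pair; all names and templates below are invented for illustration.

```python
# Illustrative only: template-based relationship explanation over a tiny
# knowledge graph of (subject, relation, object) triples.

TEMPLATES = {
    "member_of": "{s} is a member of {o}.",
    "founded": "{s} founded {o}.",
    "spouse_of": "{s} is the spouse of {o}.",
}

def explain(triples, a, b):
    """Return an English sentence for the first triple linking entities a and b."""
    for s, rel, o in triples:
        # Match the pair in either direction; keep the triple's own direction
        # when filling the template, so the sentence stays factually ordered.
        if {s, o} == {a, b} and rel in TEMPLATES:
            return TEMPLATES[rel].format(s=s, o=o)
    return f"No known relationship between {a} and {b}."

kg = [
    ("Paul McCartney", "member_of", "The Beatles"),
    ("Elon Musk", "founded", "SpaceX"),
]

print(explain(kg, "The Beatles", "Paul McCartney"))
# -> Paul McCartney is a member of The Beatles.
```

The hard parts that the real research addresses, and this sketch does not, are choosing which of many connecting paths best explains a pair and generating fluent text without hand-written templates.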