Algorithms are now used throughout the public and private sectors, informing decisions on everything from education and employment to criminal justice. But despite the potential for efficiency gains, algorithms fed by big data can also amplify structural discrimination, produce errors that deny services to individuals, or even seduce an electorate into a false sense of security. Indeed, there is growing awareness that the public should be wary of the societal risks posed by over-reliance on these systems.
In the wake of the US election, concerns are surfacing over the filter bubbles that mediate the information people see in their social media feeds. Filter bubbles are formed by the algorithms that social media sites like Facebook use to decide which information to show you, based largely on your own tastes. The idea is to keep you engaged, but the result may be a worldview skewed to fit your own preferences and biases. Some 62 per cent of Americans get their news from social media at least occasionally.
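To make that mechanism concrete, here is a minimal, purely illustrative sketch of engagement-driven ranking: candidate stories are scored by how closely their topics match what a user has already clicked on, so the feed naturally drifts toward the user's existing preferences. The data structures, topic labels, and weights are invented for illustration and are not Facebook's actual system.

```python
# Toy illustration of engagement-driven feed ranking and how it produces a
# filter bubble: stories most similar to what a user already clicked on float
# to the top. All names, topics, and weights here are invented for illustration.
from collections import Counter

def topic_profile(clicked_stories):
    """Build a simple topic-frequency profile from a user's click history."""
    return Counter(topic for story in clicked_stories for topic in story["topics"])

def predicted_engagement(story, profile):
    """Score a candidate story by how strongly it overlaps the user's profile."""
    return sum(profile.get(topic, 0) for topic in story["topics"])

def rank_feed(candidates, clicked_stories):
    """Order candidate stories by predicted engagement, highest first."""
    profile = topic_profile(clicked_stories)
    return sorted(candidates, key=lambda s: predicted_engagement(s, profile), reverse=True)

history = [{"topics": ["immigration", "crime"]}, {"topics": ["crime"]}]
candidates = [
    {"title": "Another crime story", "topics": ["crime"]},
    {"title": "Fact-check: the actual crime statistics", "topics": ["statistics", "fact-check"]},
]
for story in rank_feed(candidates, history):
    print(story["title"])  # the reinforcing story comes first
```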
(An investigation in which we use Facebook’s social graph API to see whether fake news or real news is more viral.) UPDATE: Since posting, there has been some discussion about this post’s use of the phrase “top stories from local newspapers”. A clarification on how that phrase is used, along with some methodology, has been appended at the end of the post, and some small clarifying edits have been made. The title and core claim of the post remain accurate and stand. What we present here is not the best
In the final three months of the US presidential campaign, the top-performing fake election news stories on Facebook generated more engagement than the top stories from major news outlets such as the New York Times, Washington Post, Huffington Post, NBC News, and others, a BuzzFeed News analysis has found.
During these critical months of the campaign, 20 top-performing false election stories from hoax sites and hyperpartisan blogs generated 8,711,000 shares, reactions, and comments on Facebook.
Within the same time period, the 20 best-performing election stories from 19 major news websites generated a total of 7,367,000 shares, reactions, and comments on Facebook. (This analysis focused on the top performing link posts for both groups of publishers, and not on total site engagement on Facebook. For details on how we identified and analyzed the content, see the bottom of this post. View our data here.)
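As a rough illustration of this kind of methodology, the sketch below totals shares, reactions, and comments for two groups of story URLs. It assumes the Facebook Graph API's `engagement` field on URL objects (with `share_count`, `reaction_count`, and `comment_count`); the endpoint, API version, field names, and access requirements are assumptions, and this is not necessarily how the BuzzFeed analysis was actually performed.

```python
# Rough sketch of the engagement-totalling step described above, assuming the
# Facebook Graph API's "engagement" field on URL nodes. The API version, field
# names, and permissions are assumptions; a real replication would need a valid
# access token and may need a different endpoint.
import requests

GRAPH_URL = "https://graph.facebook.com/v2.8/"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder, not a real token

def engagement(story_url):
    """Return shares + reactions + comments that Facebook reports for one URL."""
    resp = requests.get(GRAPH_URL, params={
        "id": story_url,
        "fields": "engagement",
        "access_token": ACCESS_TOKEN,
    })
    resp.raise_for_status()
    counts = resp.json().get("engagement", {})
    return (counts.get("share_count", 0)
            + counts.get("reaction_count", 0)
            + counts.get("comment_count", 0))

def total_engagement(story_urls):
    """Sum engagement over a group of top-performing link posts."""
    return sum(engagement(url) for url in story_urls)

if __name__ == "__main__":
    # Hypothetical stand-ins for the two groups of 20 top-performing stories.
    fake_news_urls = ["https://hoax-site.example.com/story"]
    mainstream_urls = ["https://major-outlet.example.com/story"]
    print("Fake election stories:", total_engagement(fake_news_urls))
    print("Major-outlet stories:", total_engagement(mainstream_urls))
```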
Up until those last three months of the campaign, the top election content from major outlets had easily outpaced that of fake election news on Facebook. Then, as the election drew closer, engagement for fake content on Facebook skyrocketed and surpassed that of the content from major news outlets.
Since Tuesday’s election, there’s been a lot of finger-pointing, and many of those fingers are pointing at Facebook, with critics arguing that its newsfeed algorithms played a major role in spreading misinformation and magnifying polarization. Some of the articles are thoughtful in their criticism, others thoughtful in their defense of Facebook, while others are full of the very misinformation and polarization that they hope will get them to the top of everyone’s newsfeed. But all of them seem to me to make a fundamental error.
In addition to doing more to weed out lies and false propaganda, Facebook could tweak its algorithm so that it does less to reinforce users’ existing beliefs and more to present factual information. This may seem difficult, but perhaps the Silicon Valley billionaires who helped create this problem should take it on before setting out to colonize Mars. Facebook should also allow truly independent researchers to collaborate with its data team to understand and mitigate these problems. A more balanced newsfeed might lead to less “engagement,” but Facebook, with a market capitalization of more than $300 billion and no competitor in sight, can afford this.
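A hedged sketch of what such a tweak could look like in principle: blend the usual reinforcement score with a simple bonus for topics the user has not engaged with, and expose the trade-off as a single `balance` parameter. The scoring, weights, and names are invented for illustration and bear no relation to Facebook's real ranking system.

```python
# Hedged sketch of the kind of tweak described above: blend predicted
# engagement (which reinforces existing interests) with a simple bonus for
# unfamiliar topics, controlled by a single 'balance' parameter. Scoring,
# weights, and names are illustrative only, not Facebook's actual ranking.
from collections import Counter

def balanced_rank(candidates, click_history, balance=0.5):
    """Re-rank a feed; balance=0 is pure reinforcement, balance=1 is pure novelty."""
    profile = Counter(t for story in click_history for t in story["topics"])

    def score(story):
        reinforcement = sum(profile.get(t, 0) for t in story["topics"])
        novelty = sum(1 for t in story["topics"] if t not in profile)
        return (1 - balance) * reinforcement + balance * novelty

    return sorted(candidates, key=score, reverse=True)

history = [{"topics": ["crime"]}, {"topics": ["crime", "immigration"]}]
candidates = [
    {"title": "Another crime story", "topics": ["crime"]},
    {"title": "Fact-check: the actual crime statistics", "topics": ["fact-check", "statistics"]},
]
for story in balanced_rank(candidates, history, balance=0.7):
    print(story["title"])  # with a high balance weight, the fact-check ranks first
```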
“Trump-elect” signals that we’ve entered an entirely different league of data-driven campaigning: the top U.S. political donor of #Election2016, billionaire psyops, a hedge-fund-backed SuperPAC, military-grade data hunger games. In other words, throw Magic Sauce at 240 million people and wait to see what sticks. In a Wall Street Journal story about Cambridge Analytica in October 2016, politics reporter Michael Kranish said:
Beneath a brief blog post about the recent launch of the BBC’s mobile app, which will offer customisable content and personalised services, it is interesting to read through users’ comments to get an impression of some of their concerns (whether or not those concerns are justified). Clearly, more study is needed not only on how to design privacy-friendly recommendation services, but also on how to communicate respect for privacy to users.
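One privacy-friendly design that is often discussed is on-device personalisation: the reading history never leaves the phone, and the app ranks a generic candidate pool locally. The sketch below is a toy illustration of that idea, with all class and method names invented; it is not the BBC app's actual architecture.

```python
# Toy illustration of on-device personalisation as one privacy-friendly design:
# the reading history stays on the phone and ranking happens locally, so the
# server only ever serves a generic candidate pool. Class and method names are
# invented; this is not the BBC app's actual architecture.
from collections import Counter

class OnDeviceRecommender:
    def __init__(self):
        self.history = []  # raw reading history; never leaves the device

    def record_read(self, topics):
        self.history.append(topics)

    def personalise(self, server_candidates, k=3):
        """Rank a generic candidate pool locally against the on-device profile."""
        profile = Counter(t for topics in self.history for t in topics)
        ranked = sorted(
            server_candidates,
            key=lambda item: sum(profile.get(t, 0) for t in item["topics"]),
            reverse=True,
        )
        return ranked[:k]

recommender = OnDeviceRecommender()
recommender.record_read(["science", "health"])
candidates = [
    {"title": "New vaccine trial results", "topics": ["health"]},
    {"title": "Transfer window rumours", "topics": ["football"]},
]
print([c["title"] for c in recommender.personalise(candidates)])
```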
Joseph Turow, Professor of Communication at the Annenberg School for Communication, explores the increasingly important role of data collection and the quantification of the individual in one of ou…