Filter bubbles, of course, create interesting business opportunities: market your brand through its anti-filter-bubble efforts, just as other brands distinguish themselves with a privacy-protective stance.
And, not unimportantly, there are big opportunities here for brands! Many brands already run programmes around sustainability or social responsibility, but as far as I know no brand has yet taken on tackling the filter bubble. Could Ziggo develop an online teaching programme that prepares children for the world of digital opinions? Could Campina use its packaging for a fun tips-and-tricks campaign? Could NS use the free Wi-Fi on its trains to make travellers aware of the bubble in a playful way?
If we now demand that algorithms be made better in this applied sense, we merely demand that the program built into them work better. But this program is not just harmless efficiency: the definition of the problems, and of the possible solutions, almost always corresponds to a neo-liberal world view. By this I mean three things. First, society is individualized: everything is attributed to individual, specifiable persons, separated from one another by their differences, and action means first and foremost individual action. Second, the individuals so identified are placed in competition with one another, through all sorts of rankings on which one's own position relative to that of others can rise or fall. And third, the group or the collective – which expresses the consciousness of its members as related to one another – is replaced by the aggregate, which is essentially formed without the knowledge of the actors: either because it is supposed to emerge spontaneously, as Friedrich von Hayek thought, or because it is constructed behind people's backs in the closed depths of the data centers, visible to only a few actors.
Research shows the lack of diverse political views on your Facebook feed is more down to self-censorship than any algorithm.
A central irony of concerns about echo chambers is that they span the political spectrum. The Wall Street Journal interactive was widely shared on Twitter and Facebook, crossing my feed via the furrowed brows of both liberals and conservatives in my network. To be fair, they’re not wrong that there is something to worry about: Polarization is real, and it is true that dedicated partisans can have outsize influence (especially on social media) regardless of their number. But social media may be part of the solution rather than the source of the problem.
The results of such politically heterogeneous connections are similarly remarkable: yes, some 39% of social media users say they've changed their settings to filter out political posts or block certain users in their network. This could be seen as an attempt to build the echo chamber, of course, but in itself it is also a clear sign that those filtering mechanisms are as yet far from effective. But conversely, some 20% of users also state that they've changed their minds about a political or social issue because…
Source: Echo Chamber? What Echo Chamber?
Facebook must create institutionalized pathways for journalists and policymakers to help shape any further changes to the algorithm. First steps could include more transparency about the business model driving these changes, incorporating opportunities for comment from members of civil society and the news industry, and creating an internal team dedicated to media ethics concerns, with an explicit mission statement driven by values rather than increasing clicks and views.
The debate over who, in the media and in the IT scene, is responsible for Trump's victory is just starting. I have been claiming for a long time that the issue is not an algorithmic one. The "algorithmic" candidate was Hillary Clinton: she embraced the big-data targeting approach that helped Obama win in 2012, and it seems her campaign was coordinated by a data-processing system called Ada. The secret of the Talking Combover's victorious campaign, by contrast, was the exploitation of crowds of "click workers", most of them located on the other side of the globe. Where Hillary Clinton spent $450 million, Trump spent about half that budget by under-paying subcontractors recruited on micro-work platforms: an army of digital pieceworkers living in developing countries. Maybe you have read the bittersweet news about a Singapore teenager who helped create a Prezi presentation for Trump. She was recruited on Fiverr, a platform where, for a few bucks, you…
On November 7, 2016, the day before the US election, I compared the social media follower counts, website performance, and Google search statistics of Hillary Clinton and Donald Trump. I was shocked when the data revealed the extent of Trump's popularity. He had more followers across all social platforms, and his posts had much higher engagement rates. I noticed that the second most popular article shared on social media in the previous six months with the words "Donald Trump" in the headline was "Why I'm Voting…
Algorithms are now used throughout the public and private sectors, informing decisions on everything from education and employment to criminal justice. But despite the potential for efficiency gains, algorithms fed by big data can also amplify structural discrimination, produce errors that deny services to individuals, or even seduce an electorate into a false sense of security. Indeed, there is growing awareness that the public should be wary of the societal risks posed by over-reliance on these systems and…