Category Archives: related projects


Fusion of Active Information for Next Generation Recommender Systems

The CrowdRec project pursues three objectives:

  • Stream Recommendation: real-time combination of information from collection, context, user interaction and user community to generate social smartfeeds for large-scale social networks;
  • Crowd Engagement: creating symbiosis between users and content that activates users to contribute;
  • Deployment and Validation: developing and testing for release of reference implementations and large-scale user trials.
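As a rough illustration of the "stream recommendation" objective, a smartfeed could blend several per-item signals into one ranking score. The feature names and weights below are invented for illustration and are not part of CrowdRec's actual reference framework:

```python
# Hypothetical sketch: blend per-item signals (content recency, user
# affinity, community popularity) into a single smartfeed score.
# Feature names and weights are illustrative only.

def smartfeed_score(item, weights=None):
    """Combine content, context, and community signals linearly."""
    weights = weights or {"recency": 0.4, "affinity": 0.35, "popularity": 0.25}
    return sum(weights[k] * item[k] for k in weights)

items = [
    {"id": "a", "recency": 0.9, "affinity": 0.2, "popularity": 0.5},
    {"id": "b", "recency": 0.3, "affinity": 0.9, "popularity": 0.4},
]

# Rank the stream by descending score.
ranked = sorted(items, key=smartfeed_score, reverse=True)
```

In a real streaming setting the weights would be learned and updated online rather than fixed by hand.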

A reference framework containing implementations of algorithms developed within the CrowdRec project is available from the project.

Source: CrowdRec

Psychologists Use Social Networking Behavior to Predict Personality Type | MIT Technology Review

The ability to automatically determine personality type could change the way social networks target services to users.

One of the foundations of modern psychology is that human personality can be described in terms of five different forms of behavior:

  1. Agreeableness: being helpful, cooperative and sympathetic towards others
  2. Conscientiousness: being disciplined, organized and achievement-oriented
  3. Extraversion: having a higher degree of sociability, assertiveness and talkativeness
  4. Neuroticism: the degree of emotional stability, impulse control and anxiety
  5. Openness: having a strong intellectual curiosity and a preference for novelty and variety

Psychologists have spent much time and many years developing tests that can classify people according to these criteria. Today, Shuotian Bai at the Graduate University of the Chinese Academy of Sciences in Beijing and a couple of buddies say they have developed an online version of the test that can determine an individual’s personality traits from their behavior on a social network such as Facebook or Renren, an increasingly popular Chinese competitor.
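The kind of model such studies typically use can be sketched as a linear mapping from behavioral features to a trait score. The features and coefficients below are made up for illustration; they are not taken from the paper, where such coefficients would be fit against labeled questionnaire data:

```python
# Illustrative only: map hypothetical, normalized social-network
# features (each in 0..1) to a Big Five trait score with a
# hand-picked linear model.

FEATURES = ["posts_per_day", "friend_count", "comments_given"]

def extraversion_score(user, coeffs=(0.5, 0.3, 0.2)):
    """Weighted sum of behavioral features as a crude trait estimate."""
    return sum(c * user[f] for c, f in zip(coeffs, FEATURES))

user = {"posts_per_day": 0.8, "friend_count": 0.6, "comments_given": 0.4}
score = extraversion_score(user)
```

A real system would learn these weights by regressing observed behavior against questionnaire-based trait scores, one model per trait.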

Source: Psychologists Use Social Networking Behavior to Predict Personality Type | MIT Technology Review

Algorithmic Regimes and Generative Strategies | World-Information Institute

Friday 4th December 2015, 19:00
ALGORITHMS ARE NOT ANGELS
with Matthew Fuller and Graham Harwood
Academy of Fine Arts Vienna, Atelierhaus, Lehargasse 8, A-1060

The continuation of the series of events on the regulatory politics of code and machines brings to Vienna two researchers and artists who have investigated these issues since the mid-90s. Matthew Fuller and Graham Harwood from the Centre for Cultural Studies, Goldsmiths, University of London call for a better understanding of digital systems in culture, politics and everyday life and demonstrate that the idea of neutral objectivity in connection with algorithms is misleading.

Entry: Free
Konrad Becker, Felix Stalder, World-Information Institute in cooperation with Kunst und Digitale Medien at the Academy of Fine Arts Vienna. Supported by SHIFT Vienna and BKA Kunst.

Saturday 5th December 2015
Invisible Algorithm College
Ephemeral Explorations on Rule-based Terrain with Matthew Fuller and Graham Harwood
Enrolment by In Situ Admission Formula only

International Conference, Friday 25th September 2015 at TU Karlsplatz, 1040 Vienna
A conference that investigates the growing influence of digital control systems and their cascading chains of agency on cultural and social reality. #algoregimes

Because of their technical nature, algorithms are often presented as a guarantee of objectivity, particularly on controversial issues. But the way in which data for the algorithm is processed affects the results, and what an index takes account of or not is as relevant as decisions for inclusion or exclusion. Algorithms raise numerous questions regarding their methodology and the claims to knowledge associated with them. Even if automated processes are ideally adapted to the computer logic of syllogisms, the complexity of the world outside cannot be reduced to unambiguous statements that can be combined easily. Addressing issues of the politics of algorithms, normative classifications and algorithmic governmentality, “Algorithmic Regimes and Generative Strategies” aims to mobilize the critical perspectives of researchers, artists and activists, and to open the field for a wider and more diverse debate.

Source: Algorithmic Regimes and Generative Strategies | World-Information Institute

Reading list | Governing Algorithms

Reading list

This is an open reading list we assembled in preparation of the Governing Algorithms conference. Please add suggestions below – or download a PDF version.

Last updated: March 21, 2013

Anderson, C. W. (2012). ‘Towards a sociology of computational and algorithmic journalism’, New Media & Society.

Barocas, S. (2012). ‘The Price of Precision: Voter Microtargeting and Its Potential Harms to the Democratic Process’, PLEAD’12, November 2, 2012, Maui, Hawaii.

Beer, D. (2009). ‘Power through the algorithm? Participatory Web cultures and the technological unconscious’, New Media & Society 11(6): 985-1002.

Bucher, T. (2012). ‘Want to be on the top? Algorithmic power and the threat of invisibility on Facebook’, New Media & Society, first published on April 8, 2012.

Cheney-Lippold, J. (2011). A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control. Theory, Culture & Society, 28(6), 164–181.

Custers, B., Zarsky, T., Schermer, B., & Calders, T. (2013). Discrimination and privacy in the information society data mining and profiling in large databases. Berlin: Springer.

D’Amato, A. (1977). ‘Can/Should Computers Replace Judges?’, Georgia Law Review 11: 1277-1301.

Domingos, P. (2012). A few useful things to know about machine learning. Communications of the ACM, 55(10), 78.

Ensmenger, N. (2012). ‘Is chess the drosophila of artificial intelligence? A social history of an algorithm’, Social Studies of Science 42(1): 5-30.

–– (2003). Letting the “Computer Boys” Take Over: Technology and the Politics of Organizational Transformation. International Review of Social History, 48(S11), 153–180.

Friedman, B. and H. Nissenbaum (1996). ‘Bias in computer systems’, ACM Transactions on Information Systems 14: 330-347.

Galloway, A. R. (2006). Gaming: Essays on Algorithmic Culture, University of Minnesota Press.

Gillespie, T. (2011). ‘Can an algorithm be wrong? Twitter Trends, the specter of censorship, and our faith in the algorithms around us’, Culture Digitally Blog, Oct 19,

–– (2012). ‘The relevance of algorithms’, in Tarleton Gillespie, Pablo Boczkowski, and Kirsten Foot (eds), Media Technologies (Cambridge, MA: MIT Press).

Gitelman, Lisa (2013). “Raw Data” is an Oxymoron. Cambridge, MA: MIT Press.

Goffey, A. (2008). ‘Algorithm’, in: Fuller, M. (ed), Software studies: A lexicon, MIT Press, pp. 15-20.

Good, I. J. (1983). ‘The philosophy of exploratory data analysis’, Philosophy of Science, 283–295.

Greiffenhagen, C. (2008). Video analysis of mathematical practice? Different attempts to “open up” mathematics for sociological investigation. Forum: Qualitative Social Research 9(3), art. 32.

Helmreich, S. (1998). Recombination, Rationality, Reductionism and Romantic Reactions: Culture, Computers, and the Genetic Algorithm. Social Studies of Science, 28(1), 39–71.

Hildebrandt, M. (2010). “The meaning and the mining of legal texts”. Presentation at The Computational Turn in the Humanities, Swansea University, March 2010. A further developed version will be published in: Berry, D. M. (Ed.) (forthcoming, 2011) Understanding Digital Humanities: The Computational Turn and New Technology. London: Palgrave Macmillan.

Introna, L. (2011). ‘The enframing of code: Agency, originality and the plagiarist’, Theory, Culture & Society, 28: 113-141.

Kowalski, R. (1979). ‘Algorithm = Logic + Control’, Communications of the ACM 22(7): 424-436.

Kraemer, F., van Overveld, K., & Peterson, M. (2010). ‘Is there an ethics of algorithms?’, Ethics and Information Technology, 13(3), 251–260.

Lash, S. (2007). ‘Power after Hegemony: Cultural Studies in Mutation?’, Theory, Culture & Society 24(3): 55-78.

Lenglet, M. (2011). ‘Conflicting Codes and Codings: How Algorithmic Trading Is Reshaping Financial Regulation’, Theory, Culture & Society 28(6): 44-66.

Lohr, S. (2013). “Algorithms Get a Human Hand in Steering Web,” The New York Times, Mar. 10, 2013.

Lynch, M. (2002). ‘Protocols, practices, and the reproduction of technique in molecular biology’, British Journal of Sociology 53(2): 203–220.

Mager, A. (2012). ‘Algorithmic Ideology’, Information, Communication & Society, 1–19.

Mackenzie, A. (2007). Protocols and the Irreducible Traces of Embodiment: The Viterbi Algorithm and the Mosaic of Machine Time, in: Hassan, R. & Purser, R. E., 24/7: Time and temporality in the network society, Stanford University Press: Stanford, CA, pp. 89-108.

Morozov, E. (2013). To Save Everything, Click Here: The Folly of Technological Solutionism, PublicAffairs/Perseus Books: New York, NY.

Muniesa, F. (2011). ‘Is a stock exchange a computer solution? Explicitness, algorithms and the Arizona Stock Exchange’, International Journal of Actor-Network Theory and Technological Innovation, 3(1), 1-15.

Noble, D. (1984). Forces of Production: A social history of industrial automation, Alfred E. Knopf: New York, NY.

Parisi, L. (2012). Digital Design and Topological Control. Theory, Culture & Society, 29(4-5), 165–192.

Ramsay, S. (2003). Special Section: Reconceiving Text Analysis: Toward an Algorithmic Criticism. Literary and Linguistic Computing, 18(2), 167–174.

Schüll, N. D. (2012). Addiction by design: Machine gambling in Las Vegas, Princeton University Press: Princeton, NJ.

Schwartz, P. M. (2010). Data Protection Law and The Ethical Use of Analytics (pp. 1–30). Washington, DC: The Centre for Information Policy Leadership, Hunton & Williams LLP.

Slavin, Kevin (2011). How algorithms shape our world, TED Talk,

Steiner, C. (2012). Automate This: How Algorithms Came to Rule Our World, Portfolio/Penguin: New York, NY.

Srinivasan, R. (2012). ‘Re-thinking the cultural codes of new media: The question concerning ontology’, New Media & Society.

Stacey, J. and Suchman, L. (2012). ‘Animation and Automation − The Liveliness and Labours of Bodies and Machines’, Body & Society, 18: 1.

Sweeney, L. (2013). Discrimination in online ad delivery. Harvard University, January 28, 2013.

Symposium on “Computer discovery and the sociology of scientific knowledge” (1989), Social Studies of Science 19(4).

Turing, A.M. (1950). ‘Computing machinery and intelligence’, Mind 59: 433-460.

Uprichard, E., Burrows, R., & Byrne, D. (2008). ‘SPSS as an “inscription device”: from causality to description?’, The Sociological Review, 56(4): 606–622.

Vries, K. (2010). ‘Identity, profiling algorithms and a world of ambient intelligence’, Ethics and Information Technology, 12(1), 71–85.

Webmoor, T. (in press). Algorithmic Alchemy, Or the Work of Code in Coordinating Creativity and Collaborators, in: Visualization in the Age of Computerization, Routledge Studies in Science, Technology and Society, edited by A. Carusi, A. S. Hoel, T. Webmoor and S. Woolgar. London: Routledge.

Zarsky, T. (2004). ‘Desperately Seeking Solutions: Using Implementation-Based Solutions for the Troubles of Information Privacy in the Age of Data Mining and the Internet Society’, Maine Law Review 56 (1): 13-59.

Ziewitz, M. (2011). ‘How to think about an algorithm? Notes from a not quite random walk’, Working paper.

Source: Reading list | Governing Algorithms

Special Issue on “Governing Algorithms” available online | Governing Algorithms

Special Issue of Science, Technology, & Human Values (STHV)
Vol. 41, Issue 1 (2016, forthcoming)
Guest Editor: Malte Ziewitz, Cornell University

Governing Algorithms: Myth, Mess, and Method
Malte Ziewitz, Cornell University

Algorithms, Governance, and Governmentality: On Governing Academic Writing
Lucas Introna, Lancaster University

Bearing Account-able Witness to the Ethical Algorithmic System
Daniel Neyland, Goldsmiths

Can an Algorithm be Agonistic? Ten Scenes from Life in Calculated Publics
Kate Crawford, Microsoft Research New England/MIT Center for Civic Media

Toward an Ethics of Algorithms: Convening, Observation, Probability, and Timeliness
Mike Ananny, University of Southern California

The Trouble with Algorithmic Decisions: An Analytic Road Map to Examine Efficiency and Fairness in Automated and Opaque Decision Making
Tal Zarsky, University of Haifa

Source: Special Issue on “Governing Algorithms” available online | Governing Algorithms

Christian Sandvig’s Research Page

Sandvig, C., Hamilton, K., Karahalios, K., & Langbort, C. (2015). Can an Algorithm be Unethical? Paper presented to the 65th annual meeting of the International Communication Association, San Juan, Puerto Rico, USA.

Sandvig, C. (2015). Seeing the Sort: The Aesthetic and Industrial Defense of “The Algorithm.” Media-N 11(1): page numbers TBD.

Eslami, M., Aleyasen, A., Karahalios, K., Hamilton, K., and Sandvig, C. (2015). FeedVis: A Path for Exploring News Feed Curation Algorithms. Software demo presented to the 18th Annual Association for Computing Machinery (ACM) Conference on Computer-Supported Cooperative Work (CSCW).

Hamilton, K., Karahalios, K., Sandvig, C. & Eslami, M. (2014). A Path to Understanding the Effects of Algorithm Awareness. In CHI Extended Abstracts on Human Factors in Computing Systems (alt.CHI). ACM, New York, NY, USA, 631-642.

Sandvig, C., Hamilton, K., Karahalios, K., & Langbort, C. (2014). An Algorithm Audit. In: Seeta Peña Gangadharan (ed.), Data and Discrimination: Collected Essays, pp. 6-10. Washington, DC: New America Foundation.

Sandvig, C., Hamilton, K., Karahalios, K., & Langbort, C. (2014). “Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms.” Paper presented to “Data and Discrimination,” a pre-conference of the 64th annual meeting of the International Communication Association, Seattle, WA, USA.

Hamilton, K., Karahalios, K., Sandvig, C., & Langbort, C. (2014). “The Image of the Algorithmic City: a Research Approach.” Interaction Design and Architecture(s) (IxDA) 20.

Sandvig, C., Hamilton, K., Karahalios, K., & Langbort, C. (2013). “Re-Centering the Algorithm,” Paper presented to “Governing Algorithms: A Conference on Computation, Automation, and Control,” New York University, New York, NY, USA.

Source: Christian Sandvig’s Research Page

Privacy Implications of Health Information Seeking on the Web | March 2015 | Communications of the ACM

Privacy online is an increasingly popular field of study, yet it remains poorly defined. “Privacy” itself is a word that changes according to location, context, and culture. Additionally, the Web is a vast landscape of specialized sites and activities that may only apply to a minority of users—making defining widely shared privacy concerns difficult. Likewise, as technologies and services proliferate, the line between on- and offline is increasingly blurred. Researchers attempting to make sense of this rapidly changing environment are frequently stymied by such factors. Therefore, the ideal object of study is one that is inherently sensitive in nature, applies to the majority of users, and readily lends itself to analysis. The study of health privacy on the Web meets all of these criteria.

Source: Privacy Implications of Health Information Seeking on the Web | March 2015 | Communications of the ACM

Think you’re reading the news for free? New research shows you’re likely paying with your privacy

We found that users were exposed to an average of eight external servers on each site. This means that many hidden third parties (again, usually advertisers) may be simultaneously observing an individual’s browsing habits. But even more surprising was our finding that news organizations appear to be among the most active perpetrators of this practice. Our investigation has revealed that among the 2,000-plus news-related websites identified by Alexa, readers are, on average, connected to over 19 third-party servers.
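The measurement described can be approximated by extracting resource hostnames from a page and counting those that belong to a different domain than the page itself. The sample HTML and hostnames below are fabricated for illustration; the study's actual instrumentation is not described here:

```python
# Count distinct third-party hosts referenced by a page's resources.
# A rough approximation of the tracker measurement described above;
# the sample document and domains are made up.
import re
from urllib.parse import urlparse

def third_party_hosts(html, first_party):
    """Return hostnames of resources not served by the first party."""
    urls = re.findall(r'(?:src|href)="(https?://[^"]+)"', html)
    hosts = {urlparse(u).hostname for u in urls}
    return {
        h for h in hosts
        if h and h != first_party and not h.endswith("." + first_party)
    }

page = '''<img src="https://ads.example-tracker.com/pixel.gif">
<script src="https://cdn.newssite.com/app.js"></script>
<a href="https://analytics.example-metrics.net/t"></a>'''

trackers = third_party_hosts(page, "newssite.com")
```

A real crawl would load pages in an instrumented browser and record all network requests, since many trackers are injected by scripts rather than present in the static HTML.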

Source: Think you’re reading the news for free? New research shows you’re likely paying with your privacy