Algorithms and Society

The digitization of society is bringing about a structural change in both the public and the private sphere, in which algorithms quite literally play a decisive role. We all interact in and with socio-technical systems such as social media, search engines, online stores, job application platforms, and news and information platforms. In these systems, algorithms largely decide which content, groups, people, or institutions are presented or recommended to us and how they are prioritized. Algorithms often take users' concrete behavior as their starting point and thereby create a complex recursive interaction between the operating algorithm and human action and experience. In this way, artificial intelligence, algorithms, and automated processes generate dynamics that may not be perceptible to users but that create social structures which substantially influence our individual lives and society as a whole. Whether these consequences are desirable can only be discussed and evaluated if we know precisely how digital technologies, the Web, and the algorithms within them shape social structures.
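
The recursive dynamic described above can be illustrated with a toy model. The following sketch is not one of the systems studied here; the scenario, names, and parameters are invented for illustration. It shows a popularity-based recommender in which every click feeds back into the ranking signal, so small early differences can grow into large disparities that individual users never see.

```python
# Illustrative sketch (hypothetical toy model): a popularity-based recommender
# whose ranking signal is fed by the very clicks it generates.
import random

def simulate_feedback_loop(n_items=10, n_rounds=1000, seed=42):
    rng = random.Random(seed)
    clicks = [1] * n_items          # start with equal (smoothed) click counts
    for _ in range(n_rounds):
        # The algorithm recommends items in proportion to past popularity ...
        item = rng.choices(range(n_items), weights=clicks, k=1)[0]
        # ... and the resulting click feeds straight back into the ranking signal.
        clicks[item] += 1
    return clicks

if __name__ == "__main__":
    counts = simulate_feedback_loop()
    share_of_top_item = max(counts) / sum(counts)
    print(f"Top item captures {share_of_top_item:.0%} of all clicks")
```

In this simple setup the feedback loop alone, without any difference in item quality, concentrates attention on a few items, which is one way such systems can shape social structures without users noticing.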

This is why GESIS studies the mechanisms of socio-technical systems in order to understand the social change they bring about and to improve the basis for informed and "good" decisions. We do this by collecting digital behavioral data on societal issues, conducting online experiments to analyze behavioral patterns and how easily they can be influenced in digital environments, and developing analytical tools. One of the most pressing social issues is inequality. Algorithms can reinforce existing social inequality or generate new distortions and discrimination. We investigate how distortions (e.g. gender bias) arise in digital practice and, conversely, how algorithms and AI can be used to counteract structural inequality and injustice as well as misinformation.
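
To give a concrete sense of how such distortions can be made measurable, the sketch below compares a group's share in the top-k results of a ranking with a reference share (for example, the group's share in the relevant population). This is a deliberately simple, hypothetical metric, not the method used in the publications listed below; the data and parameter values are invented.

```python
# Illustrative sketch (hypothetical data and function name): quantifying the
# over- or under-representation of a group in the top-k results of a ranking,
# relative to a reference share such as the group's population share.

def representation_bias(results, group, k, reference_share):
    """Difference between a group's share in the top-k results and a reference share.

    results: list of group labels in ranked order, e.g. ["m", "f", "m", ...]
    Positive values mean over-representation, negative values under-representation.
    """
    top_k = results[:k]
    observed_share = sum(1 for r in top_k if r == group) / len(top_k)
    return observed_share - reference_share

if __name__ == "__main__":
    # Hypothetical image-search results for an occupation query.
    ranked_genders = ["m", "m", "f", "m", "m", "m", "f", "m", "m", "m"]
    bias = representation_bias(ranked_genders, group="f", k=10, reference_share=0.5)
    print(f"Representation bias for 'f' in top 10: {bias:+.2f}")
```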

  • Beytía, Pablo, and Claudia Wagner. 2022. "Visibility layers: a framework for systematising the gender gap in Wikipedia content." Internet Policy Review 11 (1). doi: https://doi.org/10.14763/2022.1.1621.
  • Soldner, Felix, Bennett Kleinberg, and Shane Johnson. 2022. "Confounds and overestimations in fake review detection: Experimentally controlling for product-ownership and data-origin." PLoS ONE 17 (12): e0277869. doi: https://doi.org/10.1371/journal.pone.0277869.
  • Ulloa, Roberto, Ana Carolina Richter, Mykola Makhortykh, Aleksandra Urman, and Celina Kacperski. 2022. "Representativeness and face-ism: Gender bias in image search." New Media & Society 26 (6). doi: https://doi.org/10.1177/14614448221100699.
  • Ulloa, Roberto, Mykola Makhortykh, and Aleksandra Urman. 2022. "Scaling up search engine audits: Practical insights for algorithm auditing." Journal of Information Science 50 (2). doi: https://doi.org/10.1177/01655515221093029.
  • Hagen, Lutz M., Mareike Wieland, and Anne-Marie In der Au. 2017. "Algorithmischer Strukturwandel der Öffentlichkeit: Wie die automatische Selektion im Social Web die politische Kommunikation verändert und welche Gefahren dies birgt" [Algorithmic structural transformation of the public sphere: How automatic selection on the social web is changing political communication and what dangers this poses]. MedienJournal 41 (2): 127-143. doi: https://doi.org/10.24989/medienjournal.v41i2.1476.

Projects:
  • Political polarization and individualized online information environments: A longitudinal tracking study (POLTRACK), 2022-01-01 to 2025-08-30, funded by SAW (Leibniz)
  • NFDI for Data Science and Artificial Intelligence (NFDI4DS), 2021-10-01 to 2026-09-30, funded by DFG
  • Dehumanization Online: Measurement and Consequences (Professorinnenprogramm) (DeHum), 2021-01-01 to 2026-09-30, funded by SAW (Leibniz)
