International Center for Ethics in the Sciences and Humanities (IZEW)

DiversPrivat

Based on an ethically grounded re-systematisation of vulnerable groups and their specific requirements and needs in the context of privacy, the DiversPrivat project will develop and test protection concepts adapted to these target groups. The aim of the project is to explore a) which specific needs vulnerable groups have with regard to privacy protection and b) which mechanisms can be implemented to make users aware, in everyday internet use, that behavioural data is systematically collected in digital communication and how this collection can be prevented. Furthermore, the project c) provides an ongoing media-ethical reflection on communication with and about (vulnerable, marginalised) social groups that do not meet the prevailing competence requirements for privacy literacy. The overall aim is to support the free and self-determined use of digital services by vulnerable groups.

Funding

June 2023 - May 2026

Project

Despite decades of research on online privacy, there is still insufficient research on how to better protect users, or encourage them to protect themselves, especially those who take no protective measures for structural reasons or because of a lack of knowledge or motivation. We postulate that the generally useful construct of informed consent based on privacy literacy falls short at this point and needs to be reconceptualised in relation to vulnerable groups. Many users of digital services give consent to the collection and processing of their data without being aware of the consequences this may have. This problem is particularly acute for those groups of people who, for structural or individual reasons, have less background knowledge about digital privacy. They also often do not feel able to make self-determined decisions in technical matters.

Since it is neither sensible nor practicable to reach these groups through, for example, evening courses in order to strengthen their privacy literacy, alternative approaches will be tested instead. Based on an ethically grounded re-systematisation of vulnerable groups and their specific requirements and needs in the context of privacy, protection concepts will be developed and tested. Instead of cognitive instruction, the implications of privacy invasions are to be made tangible through real-world and, where necessary, sensorially perceptible signals. This could be done, for example, by evoking emotions through a realistic presentation of the potential consequences of disclosing data on the internet. Building on this, such awareness-raising signals can be presented in subsequent technology interactions as a reminder not to consent too hastily. The aim of the project is therefore to explore a) what specific needs vulnerable groups have with regard to access to privacy protection and b) what mechanisms can be implemented to make users aware, in everyday internet use, that behavioural data is systematically collected in digital communication and how this collection can be prevented. Furthermore, the project c) provides an ongoing media-ethical reflection on communication with and about (vulnerable, marginalised) social groups that do not meet the majority society's competence requirements for privacy literacy. The overall aim is to support the free and self-determined use of digital services by vulnerable groups.
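
As a purely illustrative sketch (the project description does not specify any concrete implementation), the following Python snippet shows one way such an awareness-raising signal could be built into a consent dialogue: before consent is recorded, a concrete, plainly worded example of a possible consequence of the data disclosure is shown, and a short reflection pause is enforced so that the decision is not made too quickly. All names and messages here are hypothetical.

    import time

    # Hypothetical examples that make possible consequences of data disclosure
    # tangible; in the project such messages would be developed and tested
    # together with the respective target groups.
    CONSEQUENCE_EXAMPLES = {
        "location": "Your daily routes could reveal where you live and work.",
        "contacts": "Companies could contact the people in your address book.",
        "browsing": "Advertisers could infer health or money problems from the pages you visit.",
    }

    def ask_consent_with_awareness_signal(data_category: str, pause_seconds: int = 5) -> bool:
        """Ask for consent, but first present a concrete possible consequence
        and enforce a short reflection pause (a reminder not to agree too quickly)."""
        consequence = CONSEQUENCE_EXAMPLES.get(
            data_category,
            "Your data could be combined into a detailed personal profile.",
        )
        print(f"This service wants to collect your {data_category} data.")
        print(f"For example: {consequence}")
        print(f"Please take {pause_seconds} seconds to think it over ...")
        time.sleep(pause_seconds)  # friction instead of a one-click "accept all"
        answer = input("Do you still want to allow this? [yes/no] ").strip().lower()
        return answer in ("yes", "y")

    if __name__ == "__main__":
        if ask_consent_with_awareness_signal("location"):
            print("Consent recorded.")
        else:
            print("No consent - the data will not be collected.")

In the project itself, the form of the signal (visual, emotional, possibly sensorially perceptible) and the length of any pause would be subjects of the ethical and empirical work rather than fixed design choices.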

At the moment, for example in the GDPR, this free and self-determined use is governed in data protection law almost exclusively by the mechanism of “informed consent”, which in turn only works if the individual has sufficient knowledge. “Informed consent” is an ethical and legal principle for legitimising the collection of personal data. In principle, its prerequisites are knowledge, voluntariness and decision-making capacity. Depending on individual cognitive and socio-emotional status, these three prerequisites are not fulfilled equally by all members of society. Certain cognitive abilities (e.g. understanding privacy, assessing future consequences) may not (yet) be present or may be limited by communication barriers; low levels of experience or structural power asymmetries can likewise make it harder for individuals or groups to clearly identify and express their interests (cf. Stapf et al., 2020).

The innovation proposed in the project is to develop, drawing on ethical and psychological expertise, alternatives to the coupling of informed consent and privacy literacy. Instead of focusing solely, as previous approaches have, on how users can be educated in the area of informational self-determination, the project focuses on intuitive behaviour as an alternative to digital literacy, which requires explicit media education. The project develops ethically, and tests psychologically and empirically, to what extent it is helpful, especially for vulnerable groups, to evoke visceral reactions rather than to pursue knowledge transfer.

The project thus aims to improve the everyday lives of diverse vulnerable groups. The goal is to design and evaluate methods and procedures that adequately support citizens in expressing their consent and that can serve as a basis for new privacy-by-design procedures. The project also addresses how the design and evaluation of user interfaces can enable users to exercise their rights under the GDPR regardless of their social, technical and cultural background. In particular, it will highlight the limits of informed consent for data processing in social environments and develop and test strategies to address them.

Publications

  • Geminn, C. (2020). Der Mensch in Recht und Technik – eine Bestandsaufnahme. In A. Hentschel, G. Hornung, & S. Jandt (Eds.), Mensch – Technik – Umwelt: Verantwortung für eine sozialverträgliche Zukunft – Festschrift für Alexander Roßnagel zum 70. Geburtstag (pp. 63–80). Nomos.
  • Geminn, C. (2020). Digitalisierung und verletzliche Gruppen im Recht. Kritische Vierteljahresschrift für Gesetzgebung und Rechtswissenschaft, 103(3), 254–288.
  • Geminn, C. (2023). Deus ex machina? – Grundrechte und Digitalisierung (Jus Publicum – Beiträge zum Öffentlichen Recht, Vol. 316). Mohr Siebeck (in press).
  • Geminn, C., & Roßnagel, A. (2015). „Privatheit“ und „Privatsphäre“ aus der Perspektive des Rechts – ein Überblick. JuristenZeitung, 70(14), 703–708.
  • Heesen, J. (2021). Responsible freedom – The democratic challenge of the regulation of online media. In L. Trifonova Price, K. Sanders, & W. N. Wyatt (Eds.), The Routledge companion to journalism ethics (pp. 433–440). Routledge.
  • Heesen, J., et al. (2021). Kritikalität von KI-Systemen in ihren jeweiligen Anwendungskontexten – Ein notwendiger, aber nicht hinreichender Baustein für Vertrauenswürdigkeit. Whitepaper aus der Plattform Lernende Systeme.
  • Heesen, J., et al. (2022). Privatheit, Ethik und demokratische Selbstregulierung in einer digitalen Gesellschaft. In A. Roßnagel & M. Friedewald (Eds.), Die Zukunft von Privatheit und Selbstbestimmung: Analysen und Empfehlungen zum Schutz der Grundrechte in der digitalen Welt (pp. 161–187). Springer Vieweg. https://doi.org/10.1007/978-3-658-35263-9_5
  • Heesen, J., Reinhardt, K., & Schelenz, L. (2021). Diskriminierung durch Algorithmen vermeiden: Analysen und Instrumente für eine digitale demokratische Gesellschaft. In G. Bauer, M. Kechaja, S. Engelmann, & L. Haug (Eds.), Diskriminierung und Antidiskriminierung: Beiträge aus Wissenschaft und Praxis (pp. 129–148). Transcript.
  • Hennig, M., et al. (2021). Privatheit, Autonomie und Verantwortung in digitalen Kulturen. In M. Hennig et al. (Eds.), Autonomie und Verantwortung in digitalen Kulturen (pp. 7–49). Nomos.
  • Krämer, N. C., & Haferkamp, N. (2011). Online self-presentation. Balancing privacy concerns and impression construction on social networking sites. In S. Trepte & L. Reinecke (Eds.), Privacy online. Perspectives on privacy and self-disclosure in the social web (pp. 127–141). Springer. doi.org/10.1007/978-3-642-21521-6_10
  • Krämer, N. C., & Schäwel, J. (2020). Mastering the challenge of balancing self-disclosure and privacy in social media. Current Opinion in Psychology, 31, 67–71. doi.org/10.1016/j.copsyc.2019.08.003
  • Krämer, N. C., & Winter, S. (2008). Impression management 2.0: The relationship of self-esteem, extraversion, self-efficacy, and self-presentation within social networking sites. Journal of Media Psychology, 20(3), 106–116.
  • Krämer, N. C., Meier, Y., Ngo, T., Princi, E., & Meinert, J. (2021). Confidential interaction with algorithms? A systematization of new privacy challenges and reflections on theoretical conceptualizations [Paper presentation]. International Communication Association 71st virtual Conference.
  • Reinhardt, K. (2020). Between identity and ambiguity. Some conceptual considerations on diversity. Symposion, 7(2), 261–
  • Reinhardt, K. (2020). Digitaler Humanismus. Jenseits von Utopie und Dystopie. Berliner Debatte Initial, 31(1), 111–123.
  • Reinhardt, K. (2021a). Über Begriffe und ihre Folgen: „Parallelgesellschaften“. In B. Frevel (Ed.), Migration und Sicherheit in der Stadt (pp. 128–139). Lit Verlag.
  • Reinhardt, K. (2021b). Diversity-sensitive social platforms and responsibility. Some ethical considerations. Információs Társadalom, 21(2), 43–62.  
  • Stapf, I., et al. (2021). Aufwachsen in überwachten Umgebungen – Interdisziplinäre Positionen zu Privatheit und Datenschutz in Kindheit und Jugend. Nomos.
  • Stapf, I., et al. (2023). Privacy and Children’s Rights. In M. Friedewald et al. (Eds.), Forum Privatheit und selbstbestimmtes Leben in der digitalen Welt, White Paper. Creative Commons. doi.org/10.24406/publica-793