International Center for Ethics in the Sciences and Humanities (IZEW)

The Role of Ethics in Shaping Robotics Development

by Sebastian Gießler & Aline Franzke

02.04.2024 · "We are here to program algorithms and build robots, so what – if I may ask – can ethics offer us?"

This was the sentiment the EGAR (Ethical and social aspects of Autonomous Robotics) team encountered at the kick-off meeting of the Baden-Württemberg Stiftung's (BWS) "Autonomous Robotics" research programme in Stuttgart on 28 June 2023. EGAR's task as an accompanying project of the BWS is to provide ethical and social scientific expertise for various robotics development projects. The audience was predominantly made up of researchers from computer science, engineering, and robotics. While the attitude was generally curious and open-minded, there was nevertheless a sense of irritation that might be rooted in the differences between disciplines, and this irritation is the starting point of this article. Thus, the guiding question of this article is: what can applied ethics do for the field of robotics?

What is ethics of robotics?

Robots are electro-mechanical machines that are equipped with a processor and use sensors to collect information about their environment. Effectors or actuators translate these signals into processes. In this way, the robot can influence its environment with its body (Misselhorn, 2013, p. 43). Ethics of robotics is a sub-field of machine ethics. This area of research works as an interdisciplinary field at the interface between philosophy, computer science, and robotics (Misselhorn, 2018, p. 1). Roughly speaking, it operates on four different levels:

a) meta-questions, such as those pertaining to which images and ideas underlie robot development (Hampton, 2015);

b) questions of design and robot development;

c) data-ethical considerations regarding the processes and the extent to which values are affected;

d) ethical questions emerging from the interaction between robots and humans (Goodrich & Schultz, 2008).

Intermediating position

The interdisciplinary nature of ethics of robotics allows one to draw on a pool of different methods for ethical evaluation and analysis. The positions of developers, policymakers, and civil society stakeholders can be contrasted and evaluated to derive well-founded propositions for further action. The EGAR project is planning dialogue formats with museums and libraries, as well as public discussions, which will be used to initiate a process of reaching mutual understanding.

The challenge and benefit of making implicit values explicit

Like artefacts in other scientific disciplines, autonomous systems are technical and scientific artefacts that are shaped by normative and epistemic values. Robotics development is just as much a question of scientific principles as it is of conscious and unconscious design decisions. In the philosophy of science, three phases of research and development processes can be distinguished: the context of discovery, the context of justification, and the context of use. Value judgements can influence each of these three phases. In other words, we must first ask what kind of problem is understood to be solvable technologically. Then the theoretical grounds on which this problem is addressed need to be evaluated. And lastly, what are the unintended consequences of the application? In short, the aim of ethics of robotics is to make the values at work in each phase of research explicit and to reveal and evaluate their influence on actual robotics development.

It is crucial to explicate the intended technical development goals and the intended field of application of robots. This opens two analytical dimensions: First, questions of research ethics, and second, questions of technology ethics that explore both the intended and unintended consequences of an application. For example, what is the difference between the advancement of an assembly (e.g. sensors and robot arms), which is often understood as a technical solution to a specific technical problem, and the intended purpose of a robot? The use of robots in care is one case where a technical solution to a social problem is being sought. What values do engineers bring to the project, what values do the people who use the system have, and to what extent are there conflicts of objectives?

Ethics of robotics can help to make contradictory objectives visible, explicate ethical and social values, and render the resulting trade-offs evident so that deliberate judgements can be made. It helps to classify these various assumptions and conflicts and to provide well-founded recommendations. This benefits both engineers, who gain a more fine-grained understanding of their own field of work, and users, whose position is explicitly recognised.

On trust and mistrust in robots

This translates into a practical application of the ethics of robotics. Even if values such as trust are difficult to operationalise (cf. Reinhardt, 2023), ethics of robotics can promote trust and user acceptance. Ethical and social considerations play a major role in the introduction of autonomous systems. By integrating ethical principles into the design and development of robots, developers can build robots that are in line with ethical and social values and can thus increase user acceptance. This can promote trust between humans and robots and favour their use in various areas such as healthcare, elderly care, and education.

It is a small step from the issue of user trust to those of liability and responsibility. Technical systems are strongly rooted in legal standards: a car manufacturer is responsible for ensuring that a vehicle is technically safe to drive, drivers are responsible for driving the car safely on the road, and the government or local authority is responsible for keeping the roads in good condition and for regular technical inspections. Are there comparable structures in the field of autonomous robotics, and if not, should there be? Is a manufacturer merely responsible for the technical design? Who should take care of regulatory frameworks, and what is the role of users, who may have little or no understanding of the technology used? Who is accountable and liable if a person is injured? Ethics of robotics can uncover possible responsibility gaps, categorise liability issues, and make normative judgements about the extent to which risk outweighs benefit. This is a direct benefit for technology developers.

Robot politics

There is a direct path leading from the bottom-level of technology development to the top-level: the legal and political framework conditions for the use of robots. Ethical considerations inform and influence these frameworks and often precede them. Examples include the opinions of the German Ethics Council on the issue of Artificial Intelligence (Deutscher Ethikrat 2023) and the European Commission's High-Level Expert Group on Artificial Intelligence (AI HLEG 2019), which play a key role in shaping national and supranational research strategies and legislative proposals, such as the German government's AI strategy and the EU AI Act. Ethics of robotics thus interacts with legal systems. Additionally, it evaluates proposed and passed laws from an ethical perspective and positions itself accordingly.

Conclusion

Public fears, but also hopes, are closely linked to the development of technologies. Ethics of robotics can open a space for reflection that allows us to look at the benefits and the general question of "why this and no other technology?" Ethics of robotics can play an important role, especially when it comes to questioning the imaginations, visions, and ideas underlying a technology.

However, ethics of robotics is not simply a helping hand which supports engineering disciplines reaching legal and ethical compliance. Even though ethics of robotics works closely with engineers and developers, it is an independent ethical field. It can reveal normative elements of robotics, identify alternatives, evaluate the resulting decision-making spaces, and make well-founded recommendations.

Ethics of robotics can assist in navigating ethical and social challenges in robot development and application. Working with state-of-the-art technology enriches canonical, long-standing philosophical debates on autonomy, privacy, transparency, and fairness. For example, it can show how established concepts of privacy conflict with demands for transparency in the context of the complex, modular production processes of technologies. Ethics of robotics has a key role to play here. It is neither easy nor trivial to determine which values can be operationalised technically and how this could be achieved.

Ethics of robotics can anticipate impacts for users through a critical perspective and contribute to improving robotics development and robots, e.g. by minimising data collection or by training AI models on diverse data sets. Large parts of day-to-day life are based on the use of technology, and technology is increasingly forming the fundamental infrastructure of social life. How this technology is built and implemented is central to its application in social environments. It is by no means necessary for citizens to become engineers or for all engineers to become ethicists. Nevertheless, the professional responsibility of engineers and everyone else involved in the development of technology is crucial. Ethics of robotics as a scientific discipline has a great deal to offer in this regard.

-------------------------------------

References

AI HLEG (2019). Ethics Guidelines for Trustworthy AI. High-Level Expert Group on Artificial Intelligence, European Commission. Accessed: 13.07.2023.

Deutscher Ethikrat (2020). Robotik für gute Pflege. Stellungnahme. Berlin.

Deutscher Ethikrat (2023). Mensch und Maschine – Herausforderungen durch Künstliche Intelligenz. Stellungnahme. Berlin.

Goodrich, M. A., & Schultz, A. C. (2008). Human–robot interaction: A survey. Foundations and Trends® in Human–Computer Interaction, 1(3), 203–275.

Germany (2018). Artificial Intelligence Strategy. Federal Ministry of Education and Research, the Federal Ministry for Economic Affairs and Energy, and the Federal Ministry of Labour and Social Affairs. www.ki-strategie-deutschland.de/home.html

Hampton, G. J. (2015). Imagining slaves and robots in literature, film, and popular culture: reinventing yesterday's slave with tomorrow's robot. Lexington Books.

Misselhorn, C. (2022). Grundfragen der Maschinenethik (5th, revised ed.). Reclams Universal-Bibliothek, Nr. 19583. Ditzingen: Reclam.

Reinhardt, K. (2023). Trust and trustworthiness in AI ethics. AI and Ethics, 3, 735–744. https://doi.org/10.1007/s43681-022-00200-5
