Abstract
Autonomous systems have pervaded many aspects of human activity. However, research suggests that interaction with these machines may influence human decision-making processes, an effect that raises ethical concerns in moral situations. We created an ad hoc setup to investigate the effects of system autonomy on moral decision-making and human agency in a trolley-like dilemma. In a battlefield simulation, 31 participants had to decide whether to initiate an attack based on conflicting moral values. Our results suggest that human decision-making in morally challenging scenarios can be influenced by recommendations from autonomous systems. Interestingly, subjective judgement of responsibility decreased with higher levels of autonomy, suggesting that interaction with intelligent systems can influence both moral decision-making and the sense of responsibility. Given the growing use of intelligent systems in sensitive domains, further research is essential to better understand these effects and their broader implications.
| Original language | English |
|---|---|
| Article number | 105350 |
| Journal | Acta Psychologica |
| Volume | 260 |
| DOIs | |
| Publication status | Published - Oct 2025 |
Keywords
- Agency
- Human performance
- Human-autonomous systems interaction
- Moral decision-making
- Responsibility