“Fire! Do not fire!”: Investigating the effects of autonomous systems on agency and moral decision-making

Adriana Salatino, Arthur Prével, Emilie Caspar, Salvatore Lo Bue

Research output: Contribution to journal › Article › peer-review

Abstract

Autonomous systems have pervaded many aspects of human activity. However, research suggests that interaction with these machines may influence human decision-making processes. These effects raise ethical concerns in moral situations. We created an ad hoc setup to investigate the effects of system autonomy on moral decision-making and human agency in a trolley-like dilemma. In a battlefield simulation, 31 participants had to decide whether to initiate an attack depending on conflicting moral values. Our results suggest that human decision-making in morally challenging scenarios can be influenced by recommendations from autonomous systems. Interestingly, subjective judgement of responsibility decreased with higher levels of autonomy, suggesting that interaction with intelligent systems can influence both moral decision-making and the sense of responsibility. Given the growing use of intelligent systems in sensitive domains, further research is essential to better understand these effects and their broader implications.

Original language: English
Article number: 105350
Journal: Acta Psychologica
Volume: 260
DOIs
Publication status: Published - Oct 2025

Keywords

  • Agency
  • Human performance
  • Human-autonomous systems interaction
  • Moral decision-making
  • Responsibility
