Automation is not a moral deus ex machina: electrophysiology of moral reasoning toward machine and human agents

  • Federico Cassioli International research center for Cognitive Applied Neuroscience (IrcCAN), Università Cattolica del Sacro Cuore, Milan, Italy. Research Unit in Affective and Social Neuroscience, Department of Psychology, Università Cattolica del Sacro Cuore, Milan, Italy.
  • Laura Angioletti | laura.angioletti1@unicatt.it International research center for Cognitive Applied Neuroscience (IrcCAN), Università Cattolica del Sacro Cuore, Milan, Italy. Research Unit in Affective and Social Neuroscience, Department of Psychology, Università Cattolica del Sacro Cuore, Milan, Italy.
  • Michela Balconi International research center for Cognitive Applied Neuroscience (IrcCAN), Università Cattolica del Sacro Cuore, Milan, Italy. Research Unit in Affective and Social Neuroscience, Department of Psychology, Università Cattolica del Sacro Cuore, Milan, Italy.

Abstract

The diffusion of automated decision-making systems could represent a critical crossroads for future society. Automated technology could feasibly be involved in morally charged decisions, with major ethical consequences. In the present study, participants (n=34) completed a task composed of moral dilemmas in which the agent (human vs. machine) and the type of behavior (action vs. inaction) were randomized. Responses in terms of the evaluation of the morality, consciousness, responsibility, intentionality, and emotional impact of the agent's behavior, together with reaction times (RTs) and EEG data (delta, theta, alpha, beta, and gamma band power), were collected. The data showed that participants applied different moral rules depending on the agent: humans were judged more moral, responsible, intentional, and conscious than machines. Interestingly, the emotional impact derived from the moral behavior was rated as more severe for humans, with shorter RTs. In the EEG data, increased gamma power was detected when participants evaluated the intentionality and the emotional impact of machines compared to humans. Higher beta power in the frontal and fronto-central regions was detected during the evaluation of the emotional impact derived from the machine's behavior. Moreover, right temporal activation was found when judging the emotional impact caused by humans. Lastly, a generalized alpha desynchronization occurred in the left occipital area when participants evaluated the responsibility derived from inaction behaviors. The present results provide evidence for the existence of different norms when judging the moral behavior of machine and human agents, pointing to a possible asymmetry in moral judgment at both the cognitive and emotional levels.

Published
2022-12-22
Section
Original Articles
Keywords:
moral dilemma, automation, human-robot ethics
How to Cite
Cassioli, F., Angioletti, L., & Balconi, M. (2022). Automation is not a moral deus ex machina: electrophysiology of moral reasoning toward machine and human agents. Medicina e Morale, 71(4), 391-411. https://doi.org/10.4081/mem.2022.1217