Autonomous vehicle moral dilemmas matter less for the particular outcomes of potential accidents than for their role in defining the values of the society we wish to live in. Different approaches have been suggested to determine the ethical settings with which autonomous vehicles should be implemented and to identify the legitimate agents for making such decisions. Most of these, however, fail on theoretical grounds, facing severe issues related to moral justification and compliance with the law, or on practical grounds, being insufficiently universal, action-guiding, or technically viable to implement. The analogy with the “trolley problem” has been extensively discussed. However, researchers have rarely tried to adapt this framework to autonomous vehicle cases or to investigate how it could be used to address these issues. In doing so, this paper aims to answer the two key problems of autonomous vehicle dilemmas. With regard to the decision-maker, it rejects choice-based models that defer to autonomous vehicle users, showing the absurdity of both switching control and adaptive preferences, and argues for common, legislator-determined ethical settings. With regard to the decisions themselves, it criticizes both utilitarian views and those based on individuals’ criteria, and proposes instead a deontological, rights-based approach. This allows for the defence of a morally coherent, regulation-compliant, explainable, and easily implementable framework capable of addressing all autonomous vehicle moral dilemma scenarios present in the literature.