Make Them Rare or Make Them Care
Oxford University Press eBooks (2023)
Abstract
This chapter addresses a notorious ethical issue concerning the military use of artificial intelligence (AI), which may represent the most disruptive and ethically problematic emerging technology on the horizon. The chapter argues that while civilians are never spared in war, part of the process of war is the moral cost-sharing of the burden of killing. AI would subvert this burden in unjust ways if applied to lethal autonomous weapon systems (LAWS). Because LAWS lack moral agency, in particular the capacity for moral emotions, the moral costs are borne only by the dead and their loved ones. By outlining and applying the Moral Affect Principle, the chapter argues that this is unjust insofar as those responsible for unjust harm to others ought to share those costs. This presents a dilemma: either autonomous weaponry is designed to be capable of moral emotions, which may be a step too far even for proponents of LAWS, or the use of such weaponry must be limited, which may risk increasing combatant casualties and jeopardizing strategic aims.
Key words
rare, care