Moral Psychology of Nursing Robots – Humans Dislike Violations of Patient Autonomy but Like Robots Disobeying Orders
Artificial intelligences (AIs) are widely used in tasks ranging from transportation and healthcare to military operations. Many tasks carried out by autonomous AIs have consequences for human well-being, but it is still unclear how people would prefer them to act in ethically difficult situations. In six studies with data from two cultures (five quantitative experiments, n = 1569, and a qualitative anthropological field study, n = 30), we presented people with hypothetical situations in which a human or an advanced robot nurse is ordered to forcefully medicate an unwilling patient. We measured moral acceptance, perceived trust, and allocation of responsibility relating to the nurse's decision either to follow orders and forcefully medicate the patient, or to disregard orders and protect the patient's autonomy. Our participants were averse to robot nurses who forcefully medicated the patient, and preferred robot nurses who respected patient autonomy by disobeying orders. Under certain conditions, the decision to respect patient autonomy was more acceptable for robot nurses than for human nurses. Thus, our results suggest that people prefer robots that are capable of disobeying orders in favor of abstract moral principles such as valuing personal autonomy. These findings were relatively robust against manipulations of the nurse's perceived reputation and character, and of whether the patient subsequently lived or died. We also found that moral judgment is distinct from evaluations of trust and responsibility: in general, our participants did not trust robot nurses or hold them responsible for their actions, whereas human nurses who forcefully medicated a patient were morally condemned but nevertheless trusted. It appears that the moral psychology of robotics is an emerging and increasingly relevant sub-field of moral psychology that warrants extensive attention.