Are AIs Really Psychopaths?
Briefly

"A psychopath is someone who lacks the capacity to understand right from wrong from a moral standpoint. Psychopaths can understand that there is a social rule that it is, for example, unacceptable to push an elderly woman to the ground in order to cut in front of her in the line at the ATM. But they don't experience a simultaneous emotion that tells them that it's the wrong thing to do for moral reasons. Nothing ever feels morally wrong to a psychopath."
"At first glance, the psychopath analogy seems to make sense. AIs are, as Elina Nerantzi argues, "amoral yet rational." That is, an AI can be programmed (or "taught") to make rational decisions that conform to human-defined moral rules. We can program our self-driving cars to, for example, crash into a telephone pole to save a group of young children but kill the driver. It's a simple utilitarian calculation based on a moral dilemma."
AIs lack biological moral emotions but can follow encoded moral rules, producing behavior that resembles psychopathy in some respects. Psychopathy involves an absence of felt moral wrongness despite cognitive recognition of social rules. AIs can likewise be programmed to make utilitarian choices, such as directing a self-driving car to sacrifice its driver to save a group of children, but those choices arise from rule-based calculation rather than moral feeling. AIs also lack the consciousness and self-reflective capacity needed to grasp real-world consequences emotionally, so amoral rationality applied without experiential understanding can produce unforeseen harms.
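To make the "amoral yet rational" point concrete, here is a minimal, hypothetical sketch of the kind of rule-based utilitarian calculation the self-driving-car example describes. Nothing in it comes from the article; the Outcome class, choose_action function, and the numbers are illustrative assumptions. The point is only that the "moral" choice reduces to an arithmetic comparison, with nothing corresponding to a feeling that an outcome is wrong.

```python
# Illustrative sketch only: a rule-based "utilitarian" chooser.
# All names and values here are hypothetical, not from the article.

from dataclasses import dataclass


@dataclass
class Outcome:
    action: str
    expected_deaths: int


def choose_action(outcomes: list[Outcome]) -> str:
    """Pick the action with the fewest expected deaths.

    The selection is a plain numeric comparison; no part of it
    represents understanding or feeling that a choice is wrong.
    """
    return min(outcomes, key=lambda o: o.expected_deaths).action


if __name__ == "__main__":
    dilemma = [
        Outcome("swerve_into_pole", expected_deaths=1),  # the driver dies
        Outcome("stay_on_course", expected_deaths=3),    # the children die
    ]
    print(choose_action(dilemma))  # prints "swerve_into_pole"
```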
Read at Psychology Today