Should your self-driving car be programmed to kill you if it means saving more strangers?


Should your self-driving car be programmed to kill you in order to save others? Matt Windsor asked Ameen Barghi, a bioethicist at the University of Alabama at Birmingham, this soon-to-be-real-life version of the classic trolley problem.

"Utilitarianism tells us that we should always do what will produce the greatest happiness for the greatest number of people," [Bargh] explained. In other words, if it comes down to a choice between sending you into a concrete wall or swerving into the path of an oncoming bus, your car should be programmed to do the former.

Deontology, on the other hand, argues that "some values are simply categorically always true," Barghi continued. "For example, murder is always wrong, and we should never do it." Going back to the trolley problem, "even if shifting the trolley will save five lives, we shouldn't do it because we would be actively killing one," Barghi said. So even when the odds favor saving more people, a self-driving car shouldn't be programmed to sacrifice its driver to keep others out of harm's way.
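To make the contrast concrete, here is a minimal, purely illustrative sketch of how the two frameworks could be written down as decision rules. It is not any real vehicle's software; the `Outcome` class, its fields, and the `choose_*` functions are hypothetical names invented for this example.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    """A hypothetical candidate maneuver and its predicted consequences."""
    name: str
    occupant_deaths: int   # expected deaths inside the car
    bystander_deaths: int  # expected deaths outside the car
    is_active_harm: bool   # does the maneuver actively redirect harm onto someone?

def choose_utilitarian(options: list[Outcome]) -> Outcome:
    # Utilitarian rule: minimize total expected deaths, regardless of who bears them.
    return min(options, key=lambda o: o.occupant_deaths + o.bystander_deaths)

def choose_deontological(options: list[Outcome]) -> Outcome:
    # Deontological rule: refuse any maneuver that actively kills someone;
    # among the remaining options, fall back to minimizing harm.
    permissible = [o for o in options if not o.is_active_harm] or options
    return min(permissible, key=lambda o: o.occupant_deaths + o.bystander_deaths)

if __name__ == "__main__":
    options = [
        Outcome("swerve into wall", occupant_deaths=1, bystander_deaths=0, is_active_harm=True),
        Outcome("stay on course", occupant_deaths=0, bystander_deaths=5, is_active_harm=False),
    ]
    print(choose_utilitarian(options).name)    # "swerve into wall"
    print(choose_deontological(options).name)  # "stay on course"
```

Run on the same two options from the article's example, the utilitarian rule sacrifices the occupant to save five, while the deontological rule refuses to actively redirect harm and stays on course. The point is only that the same scenario produces opposite answers depending on which framework the programmer encodes.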


Original image: "Google Self-Driving Car" by smoothgroover22, CC BY-SA 2.0.