Reason reports on an interesting study that seems obvious in retrospect but hadn't occurred to me: would people buy a driverless car that was programmed to kill them, if it meant saving other people's lives?
This is a version of the trolley problem, which is well studied in philosophy and has often been applied to driverless vehicles.
It seems pretty intuitive that if our car has to choose between staying in its lane and killing three people, or swerving and killing one pedestrian, we want the car to choose the option that minimizes loss of life.
But what if the life is our own?
What happens when there is no onlooker, and the one sacrificed to save ten strangers is you, the passenger in the car? Should cars be programmed to sacrifice you? In fact, most respondents agreed that self-driving cars should be programmed that way, but a significant portion believed that manufacturers would more likely program them to save their passengers no matter the cost.
That's an interesting take, and one where I'm not sure stated preferences and revealed preferences will actually turn out to align.
Originally published at www.davidincalifornia.com on October 29, 2015.