
Is it ethical to program a robot to break the law?
Is it good business practice?
It seems like these questions will come up a lot in the future of self-driving cars. In fact, I bet they already have, although I don’t know of any specific cases or how they’ve been handled.
The simplest case might be: is it legal to program a car to exceed the speed limit by 1 mph?
Or what about executing a three-point turn in the middle of an alley blocked off by a delivery truck?
It seems like a vehicle that perfectly followed all traffic laws might work great most of the time, but would occasionally come to a dead stop for hours or days on end, until an obstruction cleared.
It’s not clear to me how legislators and police should handle these situations, either.