
CNET has a mostly speculative article reporting that, in certain cases, Google is training its self-driving software to behave more like human drivers.
But which rules of the road is Google prepared to break and which ones will be all too much for its righteous soul? It will now cross double-yellow lines to avoid a car that’s, say, double-parked and blocking its path.
This is an interesting variant of the Turing Test. And do we even want self-driving cars to pass it?
After all, driving in a manner indistinguishable from a human probably means accepting some unnecessary probability of a fatal accident.
Human driving ability varies widely. If self-driving cars are merely indistinguishable from the best human drivers, how much less safe are they than cars programmed to drive perfectly?
Originally published at www.davidincalifornia.com on September 29, 2015.
