Eventually I’m going to have to move on and write about other news in the self-driving car world. But for the moment, new facts keep appearing that raise more questions than they answer about the Tesla accident.
Why Did Tesla Wait So Long?
I only learned today that the accident actually happened way back on May 7, almost two months ago.
The news didn’t break widely until Tesla announced the accident on its website on June 30.
Why did Tesla wait so long?
It’s common for institutions to release bad news right before a holiday weekend, in this case the 4th of July weekend. But Tesla also had the chance to do that over Memorial Day weekend, and still waited.
Tesla reports its financial results on the calendar-based quarter system, so June 30 was literally the last day of Q2. Was that a factor?
Did Tesla have a moral obligation to quickly inform customers that Autopilot was broken in this specific scenario?
Why Didn’t Anybody Else Break the News?
I’m astounded nobody else broke the news of this accident before now. The very first fatal Autopilot crash, and it went two months without a report.
Florida Highway Patrol was there. The truck driver was there. The victim’s family was presumably notified immediately. Since the car’s wreckage ended up against a telephone pole, presumably the utility company was on the scene too. There were probably rubberneckers, and maybe even witnesses.
It’s wild that the story stayed under the radar for so long.
Has Tesla Fixed the Problem?
Given how long Tesla took to announce the accident, I might have expected a big emphasis on Tesla’s fix and why this won’t be an issue going forward.
The lack of any such statement suggests Tesla hasn’t fixed the problem yet.
If not, why not?
Is a Fix Even Possible?
There appears to be some daylight between Mobileye, the vendor that provides the computer vision hardware for Tesla, and Tesla itself.
According to StreetInsider.com, Mobileye stated that its current systems are not designed to avoid this type of crash:
We have read the account of what happened in this case. Today’s collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020.
According to The Verge, however, Tesla believes its systems can handle this scenario:
Tesla’s autopilot system was designed in-house and uses a fusion of dozens of internally- and externally-developed component technologies to determine the proper course of action in a given scenario. Since January 2016, Autopilot activates automatic emergency braking in response to any interruption of the ground plane in the path of the vehicle that cross-checks against a consistent radar signature.
Of course, that just raises again the question of why the system failed in this case, a question Musk addressed in a tweet.
The Verge also reports that Florida Highway Patrol is leading the accident investigation, and it appears to be leaning toward filing criminal charges against the truck driver.
Even though the accident appears not to have been the fault of Autopilot, Tesla might need to get out ahead of this story a little more.
The combination of Tesla’s blog post, plus a wait-and-see approach to the FHP and NHTSA investigations, still leaves important questions unanswered. Some of those questions only Tesla can answer.
I’d love to see Tesla release more information once everyone gets back to work on Tuesday.