Elon Musk dropped a pretty big blog post on the world yesterday.
I don’t think I could do any better than pull out two quotes, and encourage you to read the whole thing.
The first master plan that I wrote 10 years ago is now in the final stages of completion. It wasn’t all that complicated and basically consisted of:
Create a low volume car, which would necessarily be expensive
Use that money to develop a medium volume car at a lower price
Use that money to create an affordable, high volume car
And…
Provide solar power. No kidding, this has literally been on our website for 10 years.
And:
So, in short, Master Plan, Part Deux is:
Create stunning solar roofs with seamlessly integrated battery storage
Expand the electric vehicle product line to address all major segments
Develop a self-driving capability that is 10X safer than manual via massive fleet learning
Enable your car to make money for you when you aren’t using it
I assume there is some relationship between these headlines.
Tesla is now managing separate investigations by the Florida Highway Patrol, the National Highway Traffic Safety Administration, and the National Transportation Safety Board, all stemming from a single accident resulting in one fatality.
That type of scrutiny requires a lot of lawyers.
Of course, if self-driving cars prove to be far safer than human-driven vehicles, that would create an offsetting loss for plaintiffs’ attorneys.
Autopilot learns from every hardware-equipped vehicle in Tesla’s fleet (roughly 80,000 cars). The fleet builds high-precision maps, refined with every pass of a vehicle, and each car downloads the map sections matching its GPS position so that its own Autopilot can navigate the location in real time, cross-checking against the car’s sensors, primarily the front-facing camera and radar.
There are privacy considerations there, which Tesla will presumably have to address, but overall I think this is a huge win for self-driving technology, and for Tesla in particular.
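For intuition, here is a minimal sketch of what that fleet-learned, map-cross-checking loop might look like. The tile scheme, names, and thresholds are my own assumptions, not Tesla’s implementation.

```python
# Hypothetical sketch of fleet learning: cars refine shared map tiles as they drive,
# and each car downloads the tile for its GPS position to cross-check its own sensors.
# Tile resolution, names, and the vote threshold are illustrative assumptions, not Tesla's code.

fleet_map = {}  # (lat_bin, lon_bin) -> number of fleet vehicles reporting an overhead structure here

def tile_key(lat, lon, resolution=1000):
    """Quantize GPS coordinates into coarse map tiles (~100 m at this resolution)."""
    return (round(lat * resolution), round(lon * resolution))

def report_overhead_structure(lat, lon):
    """Each passing fleet vehicle adds an observation, refining the shared map."""
    key = tile_key(lat, lon)
    fleet_map[key] = fleet_map.get(key, 0) + 1

def is_known_overhead_structure(lat, lon, min_reports=5):
    """A car downloads its local tile and checks whether a high radar return at this
    spot matches something the fleet has already mapped (e.g., a road sign or bridge)."""
    return fleet_map.get(tile_key(lat, lon), 0) >= min_reports
```

The payoff of a scheme like this is that a suspicious radar return can be checked against the map: if the fleet has already mapped an overhead sign at that spot, the return can safely be ignored; if not, it may deserve a response.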
Eventually I’m going to have to move on and write about other news in the self-driving car world. But for the moment new facts keep appearing that raise more questions than answers about the Tesla accident.
The news didn’t break widely until Tesla announced the accident on its website on June 30.
Why did Tesla wait so long?
It’s common for institutions to release bad news right before a holiday weekend, in this case the 4th of July weekend, but Tesla even had the chance to do that over Memorial Day weekend, and still waited.
Tesla reports its financial results on the calendar-based quarter system, so June 30 was literally the last day of Q2. Was that a factor?
Did Tesla have a moral obligation to quickly inform customers that Autopilot was broken in this specific scenario?
Why Didn’t Anybody Else Break the News?
I’m astounded nobody else broke the news of this accident before now. The very first fatal Autopilot crash, and it went nearly two months without a report.
Florida Highway Patrol was there. The truck driver was there. The victim’s family was presumably notified immediately. Since the car wreckage ran into a telephone pole, presumably the utility company was on the scene. There were probably rubberneckers, and maybe even witnesses.
It’s wild that the story stayed under the radar for so long.
Has Tesla Fixed the Problem?
Given how long Tesla took to announce the accident, I might have expected a big emphasis on Tesla’s fix and why this won’t be an issue going forward.
The lack of such a statement implies Tesla hasn’t fixed the problem yet.
If not, why not?
Is a Fix Even Possible?
There appears to be some daylight between Mobileye, the vendor that provides the computer vision hardware for Tesla, and Tesla itself.
We have read the account of what happened in this case. Today’s collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020.
Tesla’s autopilot system was designed in-house and uses a fusion of dozens of internally- and externally-developed component technologies to determine the proper course of action in a given scenario. Since January 2016, Autopilot activates automatic emergency braking in response to any interruption of the ground plane in the path of the vehicle that cross-checks against a consistent radar signature.
Of course, that just raises again the question of why the system failed in this case. To which Musk tweeted:
Radar tunes out what looks like an overhead road sign to avoid false braking events
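To make that concrete, here is a rough sketch of the kind of camera/radar cross-check those two statements describe. The thresholds, field names, and structure are my own assumptions, not Tesla’s actual logic.

```python
# Rough sketch of the cross-check described above: brake only when the camera sees the
# ground plane in the car's path interrupted AND the radar reports a consistent signature,
# while discounting returns high enough off the road to be overhead signs.
# All thresholds and field names are illustrative assumptions, not Tesla's actual code.

def should_emergency_brake(camera_obstacle, radar_return,
                           min_radar_frames=3, overhead_clearance_m=4.5):
    # 1. Vision: something must be interrupting the ground plane in the vehicle's path.
    if camera_obstacle is None or not camera_obstacle["in_path"]:
        return False

    # 2. Radar: the return must be consistent across several frames, not a flicker.
    if radar_return is None or radar_return["consistent_frames"] < min_radar_frames:
        return False

    # 3. Returns well above the road are tuned out as likely overhead signs,
    #    to avoid false braking events -- the filter Musk's tweet refers to.
    if radar_return["height_m"] >= overhead_clearance_m:
        return False

    return True
```

On one reading, a high-riding trailer crossing the car’s path could fall between these checks: hard for the camera to pick out, and returning a radar signature close enough to an overhead sign to be discounted.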
The Verge also reports that the Florida Highway Patrol is leading the accident investigation, and it appears investigators are leaning toward filing criminal charges against the truck driver.
Upshot
Even though the accident appears not to have been the fault of Autopilot, Tesla might need to get out ahead of this story a little more.
The combination of Tesla’s blog post, plus a wait-and-see approach to the FHP and NHTSA investigations, still leaves important questions unanswered. Some of those questions only Tesla can answer.
I’d love to see Tesla release more information once everyone gets back to work on Tuesday.
Yesterday’s post on the Tesla Autopilot crash generated a lot of comments, in large part because the Medium staff posted it to their homepage.
And the comments were really quite brilliant! They highlighted points that I either did not think of or that I didn’t explain clearly.
Flying and Boating
Diane Torrance and Jay Conne compared Tesla Autopilot to their experiences as an airplane pilot and a boat captain, respectively. Those are transportation modes in which some version of autopilot is used extensively, and I expect there is a lot for us to learn from what people in those fields have already figured out.
In fact, the podcast 99% Invisible did an episode on the risks and rewards of airplane autopilot last year, called “Children of the Magenta”. The title refers to the magenta line that airplane pilots track while on autopilot.
Fault
Clifford Goudey made a good point about fault and liability:
The truck driver needed to wait for a better opportunity to make the left turn and is burdened to yield to oncoming traffic.
However, Duff Bailey countered with a point about reality on the road:
a left turning driver must yield to all traffic before crossing — which is an actual impossibility under most normal traffic conditions. To prevent gridlock — human piloted cars and trucks play a subtle game of chicken with the oncoming traffic — depending on their survival instinct to slow down and not crash. When the other vehicle’s pilot system is suicidal, asleep or not working (as appears to be the case here) a crash WILL result.
In some highway situations, the speed-limit may permit a car to become visible too late for a slow turning, large truck to see the car in time to not make their turn.
Unfortunately we will never know if the driver was infact actively engaged in monitoring his environment
I think that is true, in this case. But I also think the day is coming when cars have in-cabin cameras to monitor driver attentiveness.
Sensors vs Software
Marcus Morgenthal took exception to the idea that design flaws in Tesla’s software or sensors mitigated the driver’s culpability for the accident. And I agree.
But I would also like to clarify my previous comments on how this accident might be due to failures of sensors or software or both.
Tesla’s initial comments indicated that the failure might have been due to software, which is a much easier fix. After all, Tesla has gotten quite good at patching its software over-the-air, through customers’ home WiFi networks.
A sensor failure would be more problematic, as it might require adding an additional camera or radar to every single Tesla using Autopilot. That, in turn, might require a particularly expensive mass recall.
Tesla has done the right thing with recalls in the past, and I’m confident they would do the right thing here. But it’s worth keeping the costs and incentives in mind.
Thank You
A number of commenters wrote just to thank me for a helpful article, which was really kind.
And I say thank you back!
I really enjoy writing about self-driving cars and it’s fantastic that you enjoy reading about them!
In Memoriam
I’d just like to close off by mentioning Joshua Brown, the victim, again.
By all accounts Joshua Brown loved technology, and he particularly loved his Tesla, which he nicknamed “Tessy”.
His family has quietly indicated that they will await the accident report before they decide whether or not to bring legal action related to the crash. I think that speaks to Brown’s commitment to self-driving car technology, even though it contributed to his death.
Yesterday was a big day for self-driving cars, and not in a good way.
A Tesla Model S running the Autopilot feature suffered a horrendous crash with a tractor-trailer, which resulted in the death of the Tesla’s driver. Thankfully there were no other passengers in the Tesla, and the driver of the truck was not injured.
Facts
According to the AP, the Tesla driver was Joshua Brown. Although the accident occurred in Williston, a small town in rural north Florida, Brown lived in Canton, Ohio, where he owned a small telecommunications company called Nexu Innovations.
Interestingly, Brown was an 11-year veteran of the Navy SEALs, although he left the service in 2008, and this doesn’t appear to have any bearing on the accident.
Tesla’s official statement on the accident calls Brown, “a friend to Tesla and the broader EV community.”
I have no connection to Joshua Brown or his family, but since I’m going to write extensively about this accident, I’d just like to take a moment to emphasize that this was a real person, who served his country, and the loss is a tragedy, beyond whatever the result is for Tesla and autonomous vehicles.
The Accident
The accident itself sounds gruesome, like something out of Hollywood. Apparently the Tesla was traveling fast on a divided, but not controlled-access, highway. A tractor-trailer made a left turn from the other side of the highway, crossing the side on which the Tesla was traveling.
At the time of impact, the trailer was basically perpendicular to the highway, and the Tesla crashed into it broadside.
What makes the accident especially gruesome is that the trailer was riding high off the ground. The bottom of the Tesla actually passed under the trailer and continued on for several hundred yards. The top half of the Tesla, starting at the windshield, was sheared off. While the official crash report isn’t out yet, it sounds like Brown might have been decapitated.
Autopilot
Tesla acknowledged quickly that the autopilot had been engaged at the time of the accident. According to Tesla’s statement:
Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.
There have been some reports that a Harry Potter movie was playing in the vehicle even after the crash, although these reports are disputed. If true, this would suggest Brown was not following Tesla’s instructions to pay full attention to the road, even when Autopilot is engaged.
Tesla was quick to point out that:
This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.
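For scale, here is a quick back-of-the-envelope comparison using only the figures in that statement, with the obvious caveat that a single fatality is far too small a sample for firm conclusions:

```python
# Back-of-the-envelope comparison using only the figures Tesla cites.
autopilot_miles_per_fatality = 130e6
us_miles_per_fatality = 94e6
world_miles_per_fatality = 60e6

print(autopilot_miles_per_fatality / us_miles_per_fatality)     # ~1.4x the US average
print(autopilot_miles_per_fatality / world_miles_per_fatality)  # ~2.2x the worldwide average
```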
NHTSA
Tesla reported that the National Highway Traffic Safety Administration has opened a preliminary investigation into the performance of the Autopilot feature.
NHTSA itself hasn’t posted any information on this accident, so it’s hard to know what exactly is going on with this, or what the consequences might be.
Interpretation
A Tesla Autopilot fatality is the inevitable nightmare that self-driving car proponents have been dreading for months. See my earlier post on Tesla’s Risk Equation.
Given the inevitability of a first fatality, though, this is just about the least-bad scenario possible.
Fatalities
There was only one fatality in the accident, and importantly, that fatality was the Tesla driver.
Imagine instead the accident had caused the deaths of a hypothetical family of five, riding in a minivan on the opposite side of the highway. In that case, there would be a lot of questions about whether Tesla Autopilot was endangering everyone else on the road.
Omission versus Commission
This was an accident of omission, rather than of commission. That’s surely little comfort to the family of the accident victim, but I suspect it’s much less worrisome to the public.
If, instead, the accident had resulted from the Autopilot driving the car into a barrier, the public perception of Autopilot might have taken a much bigger hit.
Circumstances
The circumstances of the accident were unusual, although not so rare as to call them unique. Nonetheless, an Autopilot accident in moderate traffic on a six-lane Interstate highway would resonate more with the public, and be more worrisome in terms of the likelihood of future accidents.
Cause
It’s impossible to know from the outside exactly why the Autopilot did not recognize the truck. Two good guesses, however, would be either the software or the sensors.
Tesla wrote that, “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”
That explanation implicates the computer vision software, which might have had sufficient sensor data to recognize the truck, but failed to process that data correctly.
Another plausible explanation, however, is that the sensor might have been insufficient. The scenario of a high-clearance truck turning unexpectedly across a highway is unusual enough that maybe the Tesla’s radar or cameras weren’t looking for it.
Level 3 vs. Level 4
My first reaction was that this might be an example of the distinction between Level 3 and Level 4 autonomous systems. Autopilot is arguably a Level 3 system, or at least close, which means that the driver can cede control to the computer, but must be ready to take control back at any time.
Some companies, such as Google and Ford, are eschewing Level 3 systems, arguing that it’s unrealistic to expect the driver to be able to take control quickly. These companies are jumping straight to Level 4 systems, where the driver never has to take control.
However, the more I read, the less this accident appears to shed light on that issue. This appears to be a case of a straight miss on Autopilot, as opposed to a case when Autopilot unsuccessfully threw control back to the driver at the last minute.
Reaction
The accident has made headlines, but it doesn’t seem like there has yet been a lot of blowback on Tesla or on self-driving cars generally. Perhaps that’s due to the mitigating factors.
Tesla’s stock is actually up for the week, indicating that Wall Street doesn’t see this as a crippling blow.
It remains to be seen whether the accident results in a costly lawsuit or settlement for Tesla, or Mobileye. Perhaps not, if the driver was a big Tesla fan, although that will now be up to his family, who may feel less generously inclined.
In the absence of a public outcry, the biggest issue might be the results of the NHTSA report. If NHTSA merely makes some recommendations about improving Autopilot in certain scenarios and ensuring that drivers pay attention to the road, that will be a win.
If the NHTSA investigation causes Tesla and other companies to significantly scale back their autonomous vehicle efforts, that would be a game-changer.
One of the big dichotomies in the autonomous driving world is between companies targeting Level 3 autonomy and those jumping right to Level 4.
Basically, this is the distinction between vehicles in which the driver has to be ready to take control at any moment (Level 3) and vehicles in which the driver can safely tune out (Level 4).
This situation is made somewhat more confusing than necessary by the fact that the US National Highway Traffic Safety Administration has put out a 5-level autonomy classification chart, while the Society of Automotive Engineers has put out a 6-level chart. In both cases Level 3 has similar, but slightly distinct, definitions.
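For reference, here is a rough paraphrase of the SAE 0–5 levels; this is simplified and in my own words, not the official SAE J3016 definitions.

```python
# Simplified paraphrase of the SAE J3016 automation levels; not the official wording.
sae_levels = {
    0: ("No automation",          "human does all the driving"),
    1: ("Driver assistance",      "human drives; system assists with steering or speed"),
    2: ("Partial automation",     "system steers and brakes; human must supervise constantly"),
    3: ("Conditional automation", "system drives; human must be ready to take over on request"),
    4: ("High automation",        "system drives itself within a defined domain; no handoff expected"),
    5: ("Full automation",        "system drives itself anywhere a human could"),
}
```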
That is the context for Volvo’s ongoing criticism of Tesla’s autopilot strategy, most recently enunciated by Volvo R&D Chief Peter Mertens in The Drive.
Every time I drive (Autopilot), I’m convinced it’s trying to kill me…Anyone who moves too early is risking the entire autonomous industry.
That last part is an interesting externalities problem. For every autonomous mile driven, in any car, there is a small but non-zero chance of a fatal accident.
There is a risk that a fatal accident, particularly a rather gruesome one, might prompt regulators to clamp down on autonomous vehicle technology and research, across all manufacturers.
In that case, by launching its autonomous technology so aggressively, Tesla is taking a risk for which it reaps the reward but only partially shares in the cost.
I’m not sure that’s an ironclad argument — after all, at some point, some automaker will have to release autonomous technology to the public. So why not Tesla, and why not now?
But a lot of people, most vocally at Volvo, are worried Tesla is doing this too early and too aggressively, and Volvo isn’t happy about the risk Tesla is foisting on the rest of the industry.
“It gives you the impression that it’s doing more than it is,” says Trent Victor, senior technical leader of crash avoidance at Volvo, in an interview with The Verge. “[Tesla’s Autopilot] is more of an unsupervised wannabe.” In other words, Tesla is trying to create a semi-autonomous car that appears to be autonomous.
Volvo has promised death-proof cars in the past.
In related news, Volvo will launch fully autonomous, Level 4 vehicles with 100 test drivers in Sweden next year, in a program called Drive Me.
Also, Elon Musk says crashes occur 50% less frequently when drivers are in autopilot mode, although it’s not clear if this counts cars that exit autopilot mode only to crash seconds later.
The Model 3, with a base price of $35,000, is Tesla’s first offering at a price point within reach of most American car buyers. In fact, with federal and state-level tax incentives, the out-the-door price might come in below the $33,500 average price of a new US car.
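As a rough illustration of that math: the $7,500 federal EV tax credit is real, but the state-level figure below is a placeholder assumption, since incentives vary widely by state.

```python
# Rough out-the-door math. The $7,500 federal EV tax credit is real; the state
# incentive is a placeholder assumption, since it varies widely by state.
base_price = 35_000
federal_credit = 7_500
state_incentive = 2_500   # placeholder; some states offer rebates in this range

effective_price = base_price - federal_credit - state_incentive
print(effective_price)    # 25000 -- well under the ~$33,500 average new-car price
```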
115,000 people have already placed $1,000 reservations for the car, securing $115 million in deposits for the company.
Autopilot will come standard on the Model 3, but the big selling points seem to be the electric (gasoline-free) powertrain and the general appeal of Tesla.
Big questions remain, including:
When will the car actually launch? The target date is late 2017, but that could slip.
Will Tesla be able to mass-produce mass-market cars? Tesla’s Fremont, California, plant has capacity for 500,000 cars per year. But Tesla only built 50,000 cars last year. Ramping up by an order of magnitude may cause problems.
Will other automakers beat Tesla to market? The Chevy Bolt, with a similar spec sheet, is slated to launch in late 2016.
Will the Model 3 be able to maintain its $35,000 price target? The Model S pricing jumped somewhat from where it was announced initially, and the same could happen for the Model 3.
Nonetheless, yesterday was a pretty great day for Tesla Motors.
The Model 3 is a mass-market vehicle, priced at $35,000. In some locations, tax credits will lower the cost to $25,000 or less.
So a big question is, will the Model 3 bring self-driving cars to the masses?
The answer would seem to be yes, given Elon Musk’s stated timeframe of 2–3 years until self-driving cars and the Model 3’s roughly 20-month launch countdown.
However, some analysts wonder whether the Model 3 can include the necessary hardware and still maintain its price goal.
According to The Motley Fool:
With Model 3’s $35,000 starting price at half the starting price of Model S and well below the $80,000 starting price of Model X, Tesla may be planning to use a more advanced autopilot hardware system in its more expensive Model S and Model X.