Independence Day

The Capitol in Richmond, Virginia, my native commonwealth.

Happy 4th of July to everyone, and particularly to those of us living in the United States of America, who celebrate Independence Day today.

I was grasping for some self-driving car angle related to Independence Day, and I came up with something, although it’s a stretch.

One of my family traditions is to celebrate the 4th of July by reading the Declaration of Independence.

Thomas Jefferson’s triumphant document is known for its soaring rhetoric at the opening:

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness. — That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed, — That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness.

And the close:

We, therefore, the Representatives of the united States of America, in General Congress, Assembled, appealing to the Supreme Judge of the world for the rectitude of our intentions, do, in the Name, and by Authority of the good People of these Colonies, solemnly publish and declare, That these United Colonies are, and of Right ought to be Free and Independent States; that they are Absolved from all Allegiance to the British Crown, and that all political connection between them and the State of Great Britain, is and ought to be totally dissolved;

Less memorable is the middle of the document, which consists of a list of accusations against British King George III.

In particular, this complaint stands out:

He has called together legislative bodies at places unusual, uncomfortable, and distant from the depository of their public Records, for the sole purpose of fatiguing them into compliance with his measures.

A great irony of the Declaration of Independence is that many of the complaints against George III seem to have gone uncorrected after independence, including this one.

Today, 33 U.S. states locate their capitals outside of their largest city, including my home state of California and my native commonwealth of Virginia.

The distance between the average resident of Illinois and the state capital in Springfield, or between a denizen of Florida and Tallahassee, is measured in hours, or even fractions of a day.

My dream for self-driving cars is to eventually move us beyond the 60 mph ceiling that has constrained automobiles for a century. One day, the seven hour car ride between Las Vegas and Carson City might shrink to an hour and a half.
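As a back-of-envelope check on that claim (the ~430-mile road distance and the 290 mph figure are my own approximations, not anything Tesla or Nevada has proposed):

```python
# Rough sanity check on the Las Vegas -> Carson City travel-time claim.
# Assumes a road distance of roughly 430 miles (an approximation).
distance_miles = 430

hours_at_60 = distance_miles / 60    # today's practical speed ceiling
hours_at_290 = distance_miles / 290  # hypothetical high-speed autonomous travel

print(f"At 60 mph:  {hours_at_60:.1f} hours")   # ~7.2 hours
print(f"At 290 mph: {hours_at_290:.1f} hours")  # ~1.5 hours
```

In other words, cutting the trip to an hour and a half implies sustained speeds close to 300 mph, which gives a sense of how far beyond today's ceiling that dream really is.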

And we’ll be a small step closer to realizing Jefferson’s dream.

More Questions About the Tesla Accident

Eventually I’m going to have to move on and write about other news in the self-driving car world. But for the moment new facts keep appearing that raise more questions than answers about the Tesla accident.

Why Did Tesla Wait So Long?

I only learned today that the accident actually happened way back on May 7, almost two months ago.

The news didn’t break widely until Tesla announced the accident on its website on June 30.

Why did Tesla wait so long?

It’s common for institutions to release bad news right before a holiday weekend, in this case the 4th of July weekend, but Tesla even had the chance to do that over Memorial Day weekend, and still waited.

Tesla reports its financial results on the calendar-based quarter system, so June 30 was literally the last day of Q2. Was that a factor?

Did Tesla have a moral obligation to quickly inform customers that Autopilot was broken in this specific scenario?

Why Didn’t Anybody Else Break the News?

I’m astounded nobody else broke the news of this accident before now. This was the very first fatal Autopilot crash, and it went two months without a report.

Florida Highway Patrol was there. The truck driver was there. The victim’s family was presumably notified immediately. Since the wreckage of the car struck a telephone pole, presumably the utility company was on the scene. There were probably rubberneckers, and maybe even witnesses.

It’s wild that the story stayed under the radar for so long.

Has Tesla Fixed the Problem?

Given how long Tesla took to announce the accident, I might have expected a big emphasis on Tesla’s fix and why this won’t be an issue going forward.

The lack of such a statement implies Tesla hasn’t fixed the problem yet.

If not, why not?

Is a Fix Even Possible?

There appears to be some daylight between Mobileye, the vendor that provides the computer vision hardware for Tesla, and Tesla itself.

According to StreetInsider.com, Mobileye stated that its current systems are not designed to avoid this type of crash:

We have read the account of what happened in this case. Today’s collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020.

According to The Verge, however, Tesla believes its systems can handle this scenario:

Tesla’s autopilot system was designed in-house and uses a fusion of dozens of internally- and externally-developed component technologies to determine the proper course of action in a given scenario. Since January 2016, Autopilot activates automatic emergency braking in response to any interruption of the ground plane in the path of the vehicle that cross-checks against a consistent radar signature.

Of course, that just raises again the question of why the system failed in this case.

Criminal Charges

The Verge also reports that Florida Highway Patrol is leading the accident investigation, and it appears they are leaning toward filing criminal charges against the truck driver.

Upshot

Even though the accident appears not to have been the fault of Autopilot, Tesla might need to get out ahead of this story a little more.

The combination of Tesla’s blog post, plus a wait-and-see approach to the FHP and NHTSA investigations, still leaves important questions unanswered. Some of those questions only Tesla can answer.

I’d love to see Tesla release more information once everyone gets back to work on Tuesday.

Comments on the Tesla Autopilot Crash

Yesterday’s post on the Tesla Autopilot crash generated a lot of comments, in large part because the Medium staff posted it to their homepage.

And the comments were really quite brilliant! They highlighted points that I either did not think of or that I didn’t explain clearly.

Flying and Boating

Diane Torrance and Jay Conne compared Tesla Autopilot to their experiences as an airplane pilot and a boat captain, respectively. Those are transportation modes in which some version of autopilot is used extensively, and I expect there is a lot for us to learn by studying what people have learned in those fields.

In fact, the podcast 99% Invisible did an episode on the risks and rewards of airplane autopilot last year. The episode is called “Children of the Magenta”. The reference is to the magenta line that airplane pilots track while on autopilot.

Fault

Clifford Goudey made a good point about fault and liability:

The truck driver needed to wait for a better opportunity to make the left turn and is burdened to yield to oncoming traffic.

However, Duff Bailey countered with a point about reality on the road:

a left turning driver must yield to all traffic before crossing — which is an actual impossibility under most normal traffic conditions. To prevent gridlock — human piloted cars and trucks play a subtle game of chicken with the oncoming traffic — depending on their survival instinct to slow down and not crash. When the other vehicle’s pilot system is suicidal, asleep or not working (as appears to be the case here) a crash WILL result.

Jay Conne added:

In some highway situations, the speed-limit may permit a car to become visible too late for a slow turning, large truck to see the car in time to not make their turn.

Driver Engagement

Brad Slusher lamented:

Unfortunately we will never know if the driver was infact actively engaged in monitoring his environment

I think that is true, in this case. But I also think the day is coming when cars have in-cabin cameras, to monitor driver attentiveness.

Sensors vs Software

Marcus Morgenthal took exception to the idea that design flaws in Tesla’s software or sensors mitigated the driver’s culpability for the accident. And I agree.

But I would also like to clarify my previous comments on how this accident might be due to failures of sensors or software or both.

Tesla’s initial comments indicated that the failure might have been due to software, which is a much easier fix. After all, Tesla has gotten quite good at patching its software over-the-air, through customers’ home WiFi networks.

A sensor failure would be more problematic, as it might require adding an additional camera or radar to every single Tesla using Autopilot. That, in turn, might require a particularly expensive mass recall.

Tesla has done the right thing with recalls in the past, and I’m confident they would do the right thing here. But it’s worth keeping the costs and incentives in mind.

Thank You

A number of commenters wrote just to thank me for a helpful article, which was really kind.

And I say thank you back!

I really enjoy writing about self-driving cars and it’s fantastic that you enjoy reading about them!

In Memoriam

I’d just like to close off by mentioning Joshua Brown, the victim, again.

By all accounts Joshua Brown loved technology, and he particularly loved his Tesla, which he nicknamed “Tessy”.

His family has quietly indicated that they will await the accident report before they decide whether or not to bring legal action related to the crash. I think that speaks to Brown’s commitment to self-driving car technology, even though it contributed to his death.

Tesla’s Autopilot Crash

Yesterday was a big day for self-driving cars, and not in a good way.

A Tesla Model S running the Autopilot feature suffered a horrendous crash with a tractor-trailer, which resulted in the death of the Tesla’s driver. Thankfully there were no other passengers in the Tesla, and the driver of the truck was not injured.

Facts

According to the AP, the Tesla driver was Joshua Brown. Although the accident occurred in Williston, a small town in rural north Florida, Brown lived in Canton, Ohio, where he owned a small telecommunications company called Nexu Innovations.

Interestingly, Brown was an 11-year veteran of the Navy SEALs, although he left the service in 2008, and this doesn’t appear to have any bearing on the accident.

Tesla’s official statement on the accident calls Brown “a friend to Tesla and the broader EV community.”

I have no connection to Joshua Brown or his family, but since I’m going to write extensively about this accident, I’d just like to take a moment to emphasize that this was a real person, who served his country, and the loss is a tragedy, beyond whatever the result is for Tesla and autonomous vehicles.

The Accident

The accident itself sounds gruesome and like something out of Hollywood. Apparently the Tesla was traveling fast on a divided, but not controlled-access, highway. A tractor-trailer made a left turn from the other side of the highway, across the side on which the Tesla was traveling.

At the time of impact, the trailer was basically perpendicular to the highway, and the Tesla crashed into it broadside.

What makes the accident especially gruesome is that the trailer was riding high off the ground. The bottom of the Tesla actually passed under the trailer and continued on for several hundred yards. The top half of the Tesla, starting at the windshield, was sheared off. While the official crash report isn’t out yet, it sounds like Brown might have been decapitated.

Autopilot

Tesla quickly acknowledged that Autopilot had been engaged at the time of the accident. According to Tesla’s statement:

Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.

There have been some reports that the movie Harry Potter was playing in the vehicle even after the crash, although those reports remain disputed. If true, this would suggest Brown was not following Tesla’s instructions to pay full attention to the road, even when Autopilot is engaged.

Tesla was quick to point out that:

This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.

NHTSA

Tesla reported that the National Highway Traffic Safety Administration has opened a preliminary investigation into the performance of the Autopilot feature.

NHTSA itself hasn’t posted any information on this accident, so it’s hard to know what exactly is going on with this, or what the consequences might be.

Interpretation

A Tesla Autopilot fatality has been the inevitable nightmare that self-driving car proponents have been worried about for months. See my earlier post on Tesla’s Risk Equation.

Given the inevitability of a first fatality, though, this is just about the least-bad scenario possible.

Fatalities

There was only one fatality in the accident, and importantly, that fatality was the Tesla driver.

Imagine instead the accident had caused the deaths of a hypothetical family of five, riding in a minivan on the opposite side of the highway. In that case, there would be a lot of questions about whether Tesla Autopilot was endangering everyone else on the road.

Omission versus Commission

This was an accident of omission, rather than of commission. That’s surely little comfort to the family of the accident victim, but I suspect it’s much less worrisome to the public.

If, instead, the accident had resulted from the Autopilot driving the car into a barrier, the public perception of Autopilot might have taken a much bigger hit.

Circumstances

The circumstances of the accident were unusual, although not so rare as to call them unique. Nonetheless, an Autopilot accident in moderate traffic on a six-lane Interstate highway would resonate more with the public, and be more worrisome in terms of the likelihood of future accidents.

Cause

It’s impossible to know from the outside exactly why the Autopilot did not recognize the truck. Two good guesses, however, would be either the software or the sensors.

Tesla wrote that, “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”

That explanation implicates the computer vision software, which might have had sufficient sensor data to recognize the truck, but failed to process that data correctly.

Another plausible explanation, however, is that the sensors might have been insufficient. The scenario of a high-clearance truck turning unexpectedly across a highway is unusual enough that maybe the Tesla’s radar or cameras weren’t looking for it.

Level 3 vs. Level 4

My first reaction was that this might be an example of the distinction between Level 3 and Level 4 autonomous systems. Autopilot is arguably a Level 3 system, or at least close, which means that the driver can cede control to the computer, but must be ready to take control back at any time.

Some companies, such as Google and Ford, are eschewing Level 3 systems, arguing that it’s unrealistic to expect the driver to be able to take control quickly. These companies are jumping straight to Level 4 systems, where the driver never has to take control.

However, the more I read, the less this accident appears to shed light on that issue. This appears to be a case of a straight miss by Autopilot, as opposed to a case in which Autopilot unsuccessfully threw control back to the driver at the last minute.

Reaction

The accident has made headlines, but it doesn’t seem like there has yet been a lot of blowback on Tesla or on self-driving cars generally. Perhaps that’s due to the mitigating factors.

Tesla’s stock is actually up for the week, indicating that Wall Street doesn’t see this as a crippling blow.

It remains to be seen whether the accident results in a costly lawsuit or settlement for Tesla or Mobileye. Perhaps not, given that the driver was a big Tesla fan, although that will now be up to his family, who may feel less generously inclined.

In the absence of a public outcry, the biggest issue might be the results of the NHTSA report. If NHTSA merely makes some recommendations about improving Autopilot in certain scenarios and ensuring that drivers pay attention to the road, that will be a win.

If the NHTSA investigation causes Tesla and other companies to significantly scale back their autonomous vehicle efforts, that would be a game-changer.

BMW+Intel+Mobileye

News is leaking that tomorrow BMW will announce a three-way partnership with Intel and Mobileye.

The partnership is billed as a move to catch Tesla, but to me it’s interesting mostly for what’s missing.

  1. There does not appear to be a Tier 1 supplier. Normally I’d expect a company like Bosch or Continental to be involved.
  2. NVIDIA isn’t named as a partner. Intel does have its own GPUs, but Intel is primarily known as a CPU specialist.

It’s an interesting set of choices for BMW, and may reflect where different suppliers are in their readiness to actually put parts in vehicles.

Driving While Brown

My wife and I love to pick up a free Palo Alto Daily Post and go through the police blotter. Unfortunately, the Post doesn’t publish the blotter online, so you have to get a paper edition to check it out.

The paper makes an art of finding the funniest police reports — our favorite was when police were called to investigate a report that a baby had been dumped in a trash can at the Walmart. Police determined the baby was a burrito.

There is an aspect of the blotter that is reliably disturbing, however, and I’m never sure if the Post publishes it intentionally or not. There is line after line of people with Hispanic-sounding names who have been pulled over and cited or arrested on suspended licenses or outstanding warrants or other non-moving offenses.

From the outside, it sure looks like these people are being pulled over for driving while brown.

Police have a lot of discretion in determining who to stop. I imagine a police officer can usually find a legitimate reason to stop any given vehicle that drives by.

One hope for the future of self-driving cars is that such stops will become much rarer. If vehicles are programmed by the manufacturer to obey all traffic laws, there will still be reasons to pull over vehicles, but pretextual stops will be harder to justify.

And hopefully driving while brown will become a thing of the past.

Driving Might Become the New Biking

Recently I’ve had a few discussions with people who are nervous about self-driving cars, mostly because they find driving fun. They’re worried that we’re entering a brave new world where people won’t be allowed to drive for fun anymore.

I think this is a legitimate concern, but my response is that driving will become like biking.

Biking today is primarily a leisure activity that people do for fun. Except in certain uncommon (and usually urban) instances, biking is rarely the most efficient or fastest way to transport yourself.

Once self-driving cars become common, I expect to see much the same thing. We might see certain roads designated only for human-driven cars, just like many paths are designated specifically for bikes today.

And it won’t shock me if we see a replay of some of the cyclist-versus-driver road rage, this time between human drivers and human passengers in self-driving cars, all trying to use the same road.

My model for thinking about how human-driven cars will map onto the self-driving road system is to think about how bikes map onto the current human-driven road system.

There are a lot of bike-only paths, often along scenic routes. Outside of those routes, cyclists will often use slower, smaller, residential streets for biking. The instances in which cyclists need to use main commuting thoroughfares are the situations in which bike-car conflict is the greatest.

So I can imagine scenic roads, like US 1 in California, or Skyline Drive in Virginia, being set aside specifically for human drivers. This might be especially true if self-driving cars ultimately attain speeds far beyond what human drivers can safely handle today.

Big interstate highways, though, might become the domain of computer-driven cars traveling hundreds of miles per hour.

Brexit and Autonomous Vehicles

The news of the day is Brexit, and there is a lot more to that than autonomous vehicles, by a long shot.

But there is a small autonomous vehicle angle, which is that the UK, and London in particular, have been positioning themselves as leaders in the autonomous vehicle race.

Nobody is quite sure what the results of Brexit will be, or even if it will happen at all.

But my guess is that the effects will be felt hardest by smaller, startup firms. Those are the companies that won’t have the legions of lawyers to translate and negotiate new cross-border agreements with the rest of the EU.

Brexit may even be a boon to larger, foreign companies, like US and Asian automakers, whose products will now face a little less competition in Europe and the rest of the world.

How to Land an Autonomous Vehicle Job: Networking

I wrote earlier that my formula for landing a job in autonomous vehicles had three parts:

  1. Coursework
  2. Projects
  3. Networking

The first two parts consist of developing skills. The third part, networking, requires selling those skills to the world.

Like all good salesmen, I used a CRM tool, although in this case it was just a spreadsheet. And I filled in that spreadsheet with all the different companies I was excited to talk with.

I didn’t have this vocabulary at the time, but essentially the potential employers in this space break down into segments.

Transportation-as-a-Service

  • Uber
  • Lyft

OEMs

  • Ford
  • Google
  • Tesla
  • Toyota
  • Mercedes
  • etc.

Tier 1 Suppliers

  • Bosch
  • Delphi
  • Continental
  • etc.

Tier 2 Suppliers

  • NVIDIA
  • Mobileye
  • Velodyne
  • etc.

With limited exceptions, these companies base their autonomous vehicle teams in three places: Michigan, Silicon Valley, and Germany. So it’s worth considering your willingness to live in those places.
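The tracking spreadsheet mentioned above can be sketched in code. This is just an illustration of the idea, assuming a plain CSV with one row per company; the company rows and statuses here are hypothetical examples, not a record of my actual spreadsheet:

```python
import csv
from io import StringIO

# A minimal sketch of the job-search "CRM" spreadsheet: one row per company,
# with its segment and the current contact status.
FIELDS = ["company", "segment", "status", "last_contact"]

rows = [
    {"company": "Ford", "segment": "OEM", "status": "applied", "last_contact": "2016-06-01"},
    {"company": "Bosch", "segment": "Tier 1", "status": "cold email sent", "last_contact": "2016-06-10"},
    {"company": "NVIDIA", "segment": "Tier 2", "status": "no response", "last_contact": "2016-05-20"},
]

# Write the spreadsheet as CSV (StringIO stands in for a real file on disk).
buf = StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)

# Reading it back, you can filter for leads that still need a follow-up nudge.
buf.seek(0)
needs_follow_up = [r["company"] for r in csv.DictReader(buf)
                   if r["status"] != "applied"]
print(needs_follow_up)
```

The point isn’t the tooling; any spreadsheet works. What matters is keeping the follow-up status visible so no lead quietly goes cold.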

The next step is to scan the careers pages of these companies and apply for the positions that interest you. These cold applications rarely lead to jobs on their own, but once you get to the right person within the company, they will usually ask whether you have applied for jobs via the website. It’s helpful to be able to answer that question yes right off the bat.

Finding the right person to contact is often laborious, but this is the key step. I leveraged my own network, which was great, but my most promising leads, including at Ford, actually came from cold emails. I found recruiters or hiring managers on LinkedIn and then sent messages. I would follow up on these messages two or three times each before giving up.

The final step was building my CV to focus on my AV work. Since my previous professional experience was less relevant, I pushed that to a second page and filled the first page of my CV with all of the courses and projects and keywords that I wanted autonomous vehicle people to see.

After that, if all goes well, will come interviews, both formal and informal. Maybe some programming challenges.

Interviews are more about luck than about predicting employee success (that’s literally true, according to personnel selection research), so it helps to have a lot of interviews; hopefully you’ll get lucky at least once.

Fortunately, I did 🙂