Testing and Validating Autonomous Vehicles

University of Michigan researchers just released an exciting but vague (that’s the second time I’ve used that formulation recently) whitepaper on testing and validation for autonomous vehicles.

Testing is one of many open challenges in autonomous vehicle development. There's no clear consensus on how much testing needs to be done, how to do it, or how safe is safe enough.

Last year, RAND issued a report estimating that it would be basically impossible to empirically verify the safety of self-driving cars on any reasonable timeframe.

Ding Zhao and Huei Peng, from Michigan, claim to have found a way to cut the billion-plus miles of required test driving by 99.9%. The four keys are:

  • Naturalistic Field Operational Tests
  • Test Matrix
  • Worst Case Scenario
  • Monte Carlo Simulation

The paper is light on details, but the approach seems to boil down to: drive dangerous situations again and again on a test track, instead of waiting for the dangerous situations to occur on the road, because that could take forever.
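The Monte Carlo piece is presumably what buys the 99.9% reduction: instead of sampling road conditions at their natural frequency, you oversample the dangerous ones and re-weight. Here's a toy importance-sampling sketch (my own illustration, not the authors' actual method) that estimates a rare-event probability far faster than naive sampling would:

```python
import math
import random

def rare_event_probability(threshold=4.0, n=100_000, shift=4.0, seed=0):
    """Estimate P(X > threshold) for X ~ N(0, 1) by sampling from a
    shifted proposal N(shift, 1) and re-weighting each hit by the
    likelihood ratio, so the rare region gets hit constantly."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)  # sample from the shifted proposal
        if x > threshold:
            # likelihood ratio: N(0,1) density / N(shift,1) density
            weight = math.exp(-x * x / 2) / math.exp(-(x - shift) ** 2 / 2)
            total += weight
    return total / n
```

Naive sampling would need millions of draws to see an event with probability around 3 × 10⁻⁵ reliably; the shifted proposal hits the rare region about half the time, and the weights correct the bias.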

And that all seems smart enough. It's like practicing three-point shots instead of just mid-range jumpers. Or building exciting new software projects as a way to learn a programming language, instead of just maintaining legacy code.

But it’s less clear how to get from essentially “focused practice” to “the car is safe enough”. Perhaps another paper is forthcoming that makes that leap.

Mobility and Self-Driving Cars and Ford

Yesterday, Ford parted ways with CEO Mark Fields, and promoted Jim Hackett to the top spot.

Hackett is an interesting person for a lot of reasons. One reason is his one-year run as head of Ford Smart Mobility, LLC, immediately before this promotion.

I’ve seen news outlets reporting that Hackett was head of Ford’s autonomous vehicle program, but that’s not quite right.

Ford Smart Mobility is a mobility-focused subsidiary that looked at combining everything from bikes to van shuttles to trains to autonomous vehicles into a seamless mobility service.

The LLC is more of a small standalone business unit, whereas Ford’s autonomous vehicle team, headed by Randy Visintainer, is housed within Ford Motor Company proper.

This distinction raises the question of which is the key market — self-driving cars, or mobility as a service?

Traditional mobility has been divided among companies that each own a different mode of transportation: the railway company is different from the car company, which is different from the bike or bus company.

Will technology change that in the future? Or is the future pretty much about self-driving taxis, with people using bikes and trains and planes more or less as they always have?

I’m not quite sure. Certainly, as people give up their personal cars and rely on ridesharing, their perspective on other forms of transportation changes. If your self-driving taxi company can’t take you 200 miles to your weekend getaway, and you don’t have your own car, maybe you need a seamless solution. Or maybe you just call Hertz.

It’s not obvious that Ford or Hackett will bank on broad mobility over pure self-driving cars, but it seems like a possibility.

Mapping the Autonomous Vehicle Industry

Keeping up with all of the partnerships, acquisitions, relationships, and startups in the autonomous vehicle world can be overwhelming!

Two different groups have recently published visual guides to the autonomous vehicle industry that I found helpful.

David Baker at The San Francisco Chronicle published the diagram above, which focuses on established automotive manufacturers and suppliers. The article also has great explanations of each segment of the graph.

At Comet Labs, Taylor Stewart put together an impressive taxonomy of 263 startups in the autonomous vehicle world, including many that I’d never noticed.

Enjoy!

GM Quits India — Are Car Sales Next?

GM is going to stop selling cars in India and South Africa by the end of 2017. A few months ago, GM sold off most of its European sales operation, too.

Maybe this is all just about profit margins and exiting underperforming markets to focus on stronger ones. Ford is cutting 10% of its workforce for similar reasons.

But it’s also easy to imagine these are early steps in a longer-term disruption of the automotive industry — a change from automotive sales to mobility.

To be sure, consumer automotive sales will continue for a long time, especially in rural areas. But the bright future of automotive manufacturers probably lies in fleet sales to ride sharing companies, or maybe even becoming ride sharing companies themselves. Pulling back from foreign markets might be part of that trend.

Autonomous Triads

Last summer, BMW and Intel and Mobileye formed a three-way partnership to develop autonomous vehicles.

Last month, Mercedes-Benz and Bosch and NVIDIA joined forces for a similar partnership.

Triads seem to be a thing in self-driving cars.

Except now that Intel has acquired Mobileye, that triad has turned into a dyad.

Not to worry, though: Delphi has joined the BMW-Intel partnership, so the rule of three is in force and all is right with the world.

What does all this really mean on the ground?

Scott McNealy once tweeted that his favorite type of partnership is a purchase order.

While I love purchase orders, there is a lot of value in more open-ended partnerships, as well. But there’s also less direct responsibility.

A few months ago Intel dealt with this by sending Mobileye a $15 billion PO for the entire company.

I’m interested to see how the latest triads of partners go.

Udacity Students on Deep Learning and Jobs

Want to get a job working on self-driving cars? Read on.

How I Landed My Dream Job Working On Self-driving Cars

Galen Ballew

The guiding star of the Udacity Self-Driving Car Nanodegree Program is to prepare students for jobs working on autonomous vehicles. So we were excited that Galen found his dream job working on autonomous vehicles for HERE in Boulder, Colorado. He also gives lots of credit to Udacity, which is generous of him 🙂

“The private Slack channel for students is filled with a tangible excitement. I’ve never been a part of a such a large student body, let alone a student body that is committed to the success of every student (no grading curve here). Between Slack, the dedicated forums, and your own private mentor, there is no reason to be stuck on a problem — there are so many people willing to help answer your questions. Instead, you can focus on finding your own way to improve the foundations of the projects.”

Self-driving Cars — Deep neural networks and convolutional neural networks applied to clone driving behavior

Ricardo Zuccolo

Ricardo provides a thorough rundown of his Behavioral Cloning project, which runs on both simulator tracks. He synthesized and built on the insights of earlier Udacity students:

“My first step was to evaluate the driver log steering histograms for 100 bins and do all required transformation, drop and augmentation to balance it. Here I followed the same methodology as in the well explained pre-processing from Mez Gebre, thanks Mez!”
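For readers who haven't done the project: the balancing Ricardo describes amounts to capping the over-represented near-zero steering bins, since most driving is straight. A rough sketch of the idea (my own simplification, not Ricardo's exact code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in steering angles: mostly near zero, as on a straight track,
# plus a minority of real turns.
angles = np.concatenate([rng.normal(0.0, 0.02, 9000),
                         rng.uniform(-1.0, 1.0, 1000)])

n_bins = 100
counts, edges = np.histogram(angles, bins=n_bins)

# Cap each bin at the median count of the non-empty bins.
cap = int(np.percentile(counts[counts > 0], 50))

keep = []
for i in range(n_bins):
    idx = np.where((angles >= edges[i]) & (angles < edges[i + 1]))[0]
    if len(idx) > cap:
        # randomly drop the excess samples in over-represented bins
        idx = rng.choice(idx, size=cap, replace=False)
    keep.extend(idx.tolist())

balanced = angles[np.array(keep, dtype=int)]
```

The balanced set is much smaller, so in practice you'd combine this dropping step with augmentation (flips, brightness shifts) to recover training volume, as Ricardo's post describes.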

Behavioral Cloning: Tiny Mistake Cost Me 15 days

Yazeed Alrubyli

Yazeed was struggling with the Behavioral Cloning Project when he realized he mixed up his colorspaces. Students on Slack pointed out that this might have been because I mixed up the colorspaces in some demo code. Oops!

“I spend about 15 days training my network over and over in the third project of the Self-Driving Cars Engineer NanoDegree by Udacity, it drives me crazy because it can’t keep itself on the road. I reviewed the code hundreds of times and nothing wrong with it. Telling you the truth, when I almost gave up, I read an article that has nothing to do with this project. It mentions that OpenCV read images as BGR (Blue, Green, Red) and guess what, the drive.py uses RGB. What The Hell I’m doing for 2 weeks !!!”
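For anyone who wants to see the bug in miniature, here's the channel swap in a few lines of NumPy (OpenCV itself isn't needed to demonstrate it):

```python
import numpy as np

# A 1x1 pure-red pixel in RGB ordering.
rgb = np.array([[[255, 0, 0]]], dtype=np.uint8)

# cv2.imread() returns channels in BGR order. If downstream code
# (like drive.py, which expects RGB) gets those bytes unchanged,
# red and blue are silently swapped.
bgr_as_rgb = rgb[:, :, ::-1]  # same effect as cv2.cvtColor with COLOR_RGB2BGR

print(bgr_as_rgb[0, 0])  # red has become blue: [0 0 255]
```

The fix is one explicit conversion (e.g. `cv2.cvtColor(img, cv2.COLOR_BGR2RGB)`) right after loading, before any frames reach the model.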

What i have learned from the first term of Udacity Self driving car nanodegree program.

Hadi N. Abu-Snineh

Hadi went back and reviewed everything he learned during Term 1. He included all of his project videos, which are awesome!

“The last and fifth project of the first term is to write a program to detect vehicles by drawing a bounding box on each detected vehicle. the project is done using a Support Vector Machine SVM that is a kind of classifier that is used to classify and differentiate between different classes. In this case, the classifier takes multiple features of images as inputs and learns to classify them into two classes, cars and non cars.”
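To make the SVM idea concrete, here's a minimal two-class classifier on stand-in features. This is a toy sketch, not Hadi's pipeline, which extracts real HOG and color features from image patches:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Stand-in feature vectors: in the real project these would be HOG and
# color-histogram features extracted from 64x64 image patches.
cars = rng.normal(loc=1.0, scale=0.5, size=(200, 32))
non_cars = rng.normal(loc=-1.0, scale=0.5, size=(200, 32))

X = np.vstack([cars, non_cars])
y = np.hstack([np.ones(200), np.zeros(200)])  # 1 = car, 0 = not car

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = LinearSVC().fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```

In the full project, the trained classifier is slid across the image at multiple scales, and each positive window contributes to the bounding box around a detected vehicle.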

Generating Faces with Deep Convolutional GANs

Dominic Monn

Dominic wrote up the generative adversarial network that he trained for the Udacity Deep Learning Nanodegree Foundations Program. Normally I downvote blog posts from other programs, but Dominic is a Self-Driving Car student and DLFND is great and GANs are very cool, so I’ll let it slide.

“The training took approximately 15–20 minutes and even though the first few iterations looked like something demonic — the end result was fine.”

What Is Waymo Doing with Lyft?

Lyft and Waymo just announced an exciting but vague self-driving car partnership that further complicates the mobility ecosystem.

According to The New York Times:

“Details about the deal between Waymo and Lyft were scant. The companies declined to comment on what types of products would be brought to market as a result of it or when the public might see the fruits of the collaboration.”

Basically, there’s a deal but nobody knows what’s in it.

The big tease, of course, is that Waymo would put its self-driving cars on Lyft’s network.

That would be totally awesome, but somewhat at odds with past indications of Waymo’s plans for commercializing its self-driving technology.

Previously, Waymo has appeared to be launching a ride-sharing service, then a lidar manufacturing business, and now possibly a self-driving car operation (without owning the actual ride-sharing service). At other points, people have speculated that Waymo might try to make its software the default tech stack of autonomous vehicles, similar to what Android has become for mobile phones.

Or maybe they’ll buy Lyft.

Jensen’s GTC Keynote

NVIDIA CEO Jensen Huang is famous for his keynote addresses at the company’s GPU Technology Conference.

Jensen delivered this year’s keynote yesterday, and the focus was deep learning and artificial intelligence. NVIDIA GPUs are critical for training deep neural networks, so NVIDIA is fast becoming as much an AI company as a gaming company.

Engadget created a super-fast mashup of Jensen’s keynote, if you only have 13 minutes to catch the highlights. In particular, check out the autonomous vehicle announcements around NVIDIA Drive and Guardian Angel.

Computer Vision and Deep Learning Walkthroughs

Here are some thorough walkthroughs of how to implement lane-finding and end-to-end learning, with all sorts of corner cases.

Want to be like these all-stars? Join the Udacity Self-Driving Car Nanodegree Program!

Self-driving Cars — Advanced computer vision with OpenCV, finding lane lines

Ricardo Zuccolo

Ricardo’s lane-finding pipeline works amazingly well on the challenge video for the Advanced Lane Finding Project. He has an incredibly thorough rundown of his pipeline: calibration, undistortion, color transforms, perspective transform, lane detection, curvature, and unwarping:

“Once you know where the lines are in one frame of video, you can do a highly targeted search for them in the next frame. This is equivalent to using a customized region of interest for each frame of video, which helps to track the lanes through sharp curves and tricky conditions.”
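The targeted search Ricardo describes is simple to sketch: keep only the candidate pixels that fall within a margin of the previous frame's polynomial fit. A hedged illustration (the function name and margin value are mine, not from Ricardo's code):

```python
import numpy as np

def search_around_previous_fit(fit, nonzero_y, nonzero_x, margin=100):
    """Keep only pixels within `margin` of the lane polynomial found in
    the previous frame, instead of repeating a blind sliding-window
    search over the whole image."""
    # x position the previous fit predicts at each candidate pixel's y
    lane_x_at_y = fit[0] * nonzero_y ** 2 + fit[1] * nonzero_y + fit[2]
    mask = np.abs(nonzero_x - lane_x_at_y) < margin
    return nonzero_x[mask], nonzero_y[mask]

# Example: a vertical line near x=200, plus one far-off noise pixel.
fit = np.array([0.0, 0.0, 200.0])
ys = np.array([10, 20, 30, 40])
xs = np.array([205, 195, 210, 500])
kept_x, kept_y = search_around_previous_fit(fit, ys, xs)
print(kept_x)  # the x=500 outlier is gone
```

The surviving pixels feed the next polynomial fit, which is what lets the pipeline track the lane smoothly through the sharp curves in the challenge video.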

Deep Learning/Gaming Build with NVIDIA Titan Xp and MacBook Pro with Thunderbolt2

Yazeed Alrubyli

I just met Yazeed yesterday at NVIDIA’s GPU Technology Conference, and then I found his blog post today. He loves GPUs and deep learning so much he flew all the way from Saudi Arabia for the conference!

“For me it was a gambling to buy 1200$ Titan Xp which is just relased 17 houres ago — when I bought it — with a promise from NVIDIA to support macOS and I don’t have the Thunderbolt3 port which is the supported port for eGPU. So, I said like Richard Branson said “Screw It, Let’s Do It” and it works like a charm. without further ado, let’s dive in.”

SqueezeDet: Deep Learning for Object Detection

Mez Gebre

A while back, Mez published his results on behavioral cloning with SqueezeNet, but he’s back with a super-enthusiastic blog post on his network. Only 52 parameters and 6-second epochs on a CPU!

“One good rule of thumb I developed from this project is to try and reduce the number of variables you are tuning to gain better results faster.”

Behavioral Cloning

Arsen Memtov

Arsen has a great writeup on using a neural network to calculate both steering and throttle values for the Behavioral Cloning Project. Also, he uses early stopping to prevent overfitting the data.

“The validation set helped determine if the model was over or under fitting. I used EarlyStopping (utils.py line 299) to stop training when validation mse has stopped improving.”
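Keras’s EarlyStopping callback, which Arsen uses, boils down to a simple rule. Here’s a plain-Python sketch of that rule (mimicking the callback’s behavior, not Keras’s actual implementation):

```python
def train_with_early_stopping(val_losses, patience=3):
    """Return the epoch index at which training would stop: halt once
    validation loss has gone `patience` consecutive epochs without
    improving, since further training tends to overfit."""
    best = float("inf")
    epochs_without_improvement = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best = loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                return epoch  # stop here
    return len(val_losses) - 1  # ran out of epochs without triggering
```

With a validation curve like `[0.9, 0.7, 0.6, 0.61, 0.62, 0.63, 0.5]` and `patience=3`, training stops at epoch 5, before ever seeing the late dip; in Keras you'd typically pair this with a checkpoint callback that restores the best weights.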

CarND Behavioral Cloning

JC Li

JC’s writeup of his Behavioral Cloning Project covers a really important topic — how to debug, or at least visualize, what’s going on inside a neural network.

“During the process of training, I felt very uneasy as it is almost like a blackbox. Whenever the model failed to proceed at a certain spot, it is very hard to tell what went wrong. Although my model passed both tracks, the process of try and error and meddle around with different combination of configurations is quite frustrating.”

Tuesday Autonomous Vehicle Links

Delphi is splitting in two. One half will focus on powertrain and other traditional, bread-and-butter automotive components. The other half will focus on software and electronics. Looks like Delphi sees software eating the automotive supply chain.

Mobileye banks on mapping revenue. There have been very few companies able to supply the high-definition maps that autonomous vehicles need. Mobileye plans to be one of them.

Tesla will upload video from consumer cars. I’m surprised this wasn’t already the case. Don’t use your Tesla to break the law. Or, if you do, cover up the camera.

K-City will dwarf M-City. Korea is building an urban testing environment for self-driving cars. It will be almost three times bigger than the environment that Ford and the University of Michigan built.

Uber ATG to open an office in Canada. This will be of interest to Udacity’s Canadian students.