CarND: Experiences and Lessons Learned

A few days ago George Sung, who is in the first cohort of students of the Udacity Self-Driving Car Nanodegree Program, gave a presentation about his experience in the program to the Boston Self-Driving Cars Meetup.

It’s a thorough overview of the program so far. If you’re interested in signing up for the Nanodegree program, or if you’re already a student and interested in how another student has experienced it, it’s worth a watch.

George’s presentation starts at about 18:40 on the video.

Driving Outside the Lines

Last week my colleague Lisbeth organized a really fun event for Udacity Self-Driving Car Students.

We rented out an auditorium at the Computer History Museum in Mountain View, and had several amazing engineers talk about their work on autonomous vehicles.

The opening act was a tag-team of my colleagues Mac and Eric, talking about their work on Udacity’s own open-source self-driving car:

The main event was Sebastian Thrun interviewing Axel Gern, Head of Autonomous Driving at Mercedes-Benz North America:

And the encore was the ever-exciting George Hotz announcing Comma.ai’s new strategy:

Fun fact — at the beginning of George’s presentation, you’ll see him gesturing off-stage to some anonymous person to advance his slides. That was me. Working at a startup means wearing a lot of hats 😉

Delphi and Mobileye Roll Together

The San Francisco Chronicle got an up-close and personal look at Delphi’s partnership with Mobileye, and the self-driving cars that partnership has produced:

With the race to develop self-driving cars now at an all-out sprint, Delphi and Mobileye believe they possess an edge.

They have developed a system for crowdsourcing the hyper-detailed 3-D maps upon which autonomous vehicles rely. Millions of non-autonomous cars that use Mobileye cameras for lane keeping or collision prevention will create a constant stream of data to map roads and potential obstacles, even temporary ones such as road repair crews or double-parked cars.

Also, this:

“You can’t develop autonomous cars that just follow all the rules, because they’ll just clog cities,” [Mobileye executive Dan] Galves said. “The point is really providing the intelligence and the rules of breaking the rules, if you will — providing some human intuition into the vehicles.”

Self-Driving Cars and Regulators

Business Insider has a great inside-baseball story on the early days of Otto, particularly the negotiations and maneuvering that took place in running their first self-driving truck tests in Nevada.

The Nevada regulatory bureaucracy is generally very amenable toward autonomous vehicles, but there’s still a certain amount of required testing and licensing.

According to the BI article, Otto had to figure out a way around that in order to keep up their frantic development pace:

Before an autonomous vehicle can be operated on the state’s roads, it must be issued a testing license and special red license plates. It has to be able to capture driving data in case of crashes, have switches to engage and disengage the autonomous systems, and have a way to alert the human operator if it fails.

To obtain a license, Otto would have had to produce evidence of 10,000 miles of previous autonomous operation and submit a truck for a self-driving test, such as the one completed by Google in 2012. It would also need to post a $5 million bond and file reams of paperwork. Even with all those requirements fulfilled, Otto’s demo would need two people seated up front, one of them poised to take over in the event of a failure.

Otto’s founders were faced with a stark choice. They could submit to the DMV and undertake the laborious process of modifying, testing, and licensing their truck. This would likely take a month or more, and could risk their first-mover advantage in driverless trucking. Or the engineers could continue with their test as planned.

Nissan’s Autonomous Towing System

Logistics is one of the industries most ripe for autonomous vehicles. A lot of logistics involves traversing the same areas over and over again. And the vehicles are in use more or less constantly, as opposed to personal vehicles, which are parked 95% of the time.

In that vein, Nissan has just announced an autonomous towing system they’ve put into use at one of their plants in Japan.

And the video continues the tradition of autonomous vehicle videos with rather whimsical musical selections:

Apple Opens Up (A Little)

Apple has always been one of the more secretive companies in Silicon Valley, especially with its self-driving car effort. Apple has never confirmed the existence of its work on self-driving cars, although there is a lot of circumstantial evidence that the company is working in the field.

Apple took a couple of steps recently to confirm its work in autonomous vehicles, though. According to Quartz:

Apple, unsurprisingly, is working on a lot of the same problems as other companies that are exploring machine learning: recognizing and processing images, predicting user behavior and events in the physical world, modeling language for use in personal assistants, and trying to understand how to deal with uncertainty when an algorithm can’t make a high-confidence decision.

And this:

Apple has been long-rumored to be building an autonomous vehicle, even sending a letter to the US National Highway Traffic Safety Administration (NHTSA) saying that companies looking to test autonomous vehicles should be treated the same, whether they are new to the field or more established self-driving car shops like Google and Uber.

Blurring the Line Between Level 2 and Level 3

One of the big debates in autonomous vehicle development is whether it’s safe to build a Level 3 autonomous vehicle. This would be a vehicle that could drive itself independently, but could hand control back to the human driver at any moment.

Ford, for example, rather prominently believes that Level 3 vehicles are unsafe.

Tesla, on the other hand, seems to be iteratively developing a Level 3 vehicle.

GM just released their plan, and it reads like something that blurs the line between Level 2 (ADAS) and Level 3:

General Motors’s semi-autonomous “Super Cruise” system will allow drivers to take their hands off the steering wheel for extended periods, but will stop the vehicle automatically if drivers are not attentive, according to a government letter made public on Monday.

The largest U.S. automaker in September 2014 unveiled planned technology to allow drivers on highways to let the vehicle take over driving itself.

But if the road has too many twists and turns or the vehicle detects the driver is not paying attention, it issues a series of alerts. If the human driver does not take over, the vehicle will automatically slow down and then put on the hazard lights.

I guess that just goes to show that it’s hard to draw bright lines in this space.

TensorFlow on Windows

TensorFlow is the main deep learning library we are using in the Udacity Self-Driving Car Engineer Nanodegree Program, and it’s been a little bit painful because of the lack of Windows support.

We’ve had to work with our Windows users to set up Docker containers in which to run TensorFlow, and frankly we haven’t done as good a job of that as we should have.

So thank goodness Google announced yesterday that they’re releasing Windows support for TensorFlow.

It looks to be early stages and I’m not sure if we can safely point our students there yet, but hopefully it means we can get there soon.

And it’s also another step toward TensorFlow becoming the library of choice for deep learning.

ALVINN and Udacity

Deep neural networks actually have quite a long history of powering self-driving cars. Way back in the 1980s, researchers at CMU used a basic two-layer neural network to power a truck.
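To make the idea concrete, here is a toy sketch of that kind of two-layer network: a coarse camera image feeds a small hidden layer, which feeds a handful of steering-direction outputs. This is only an illustration of the architecture class — the dimensions, random weights, and activation choices here are assumptions, not ALVINN’s actual design (ALVINN learned its weights by watching a human drive):

```python
import math
import random

random.seed(0)

# Toy dimensions, loosely inspired by ALVINN's coarse 30x32 input image.
N_IN, N_HIDDEN, N_OUT = 30 * 32, 4, 5

# Random weights stand in for training.
w1 = [[random.uniform(-0.1, 0.1) for _ in range(N_IN)] for _ in range(N_HIDDEN)]
w2 = [[random.uniform(-0.1, 0.1) for _ in range(N_HIDDEN)] for _ in range(N_OUT)]

def steer(image):
    """Forward pass: pixels -> hidden layer -> distribution over steering directions."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, image))) for row in w1]
    logits = [sum(w * h for w, h in zip(row, hidden)) for row in w2]
    exp = [math.exp(v) for v in logits]
    total = sum(exp)
    return [e / total for e in exp]  # softmax over steering directions

image = [random.random() for _ in range(N_IN)]  # stand-in camera frame
probs = steer(image)
direction = max(range(N_OUT), key=probs.__getitem__)  # index of the chosen direction
```

The striking thing is how little machinery is involved: two matrix multiplies and a couple of nonlinearities were enough, even in the 1980s, to keep a truck in its lane.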

We almost covered ALVINN (Autonomous Land Vehicle in a Neural Network) within the Deep Neural Networks module of the Udacity Self-Driving Car Nanodegree Program, but we cut it for time.

Recently, Udacity’s work on neural networks and self-driving cars has reminded people about what a breakthrough ALVINN was.

This proto-driverless vehicle [ALVINN] came up recently in a Twitter discussion between two engineers: Oliver Cameron, who heads an open-source self-driving car project at Udacity, and Dean Pomerleau, a CMU professor who ran the self-driving car project that gave birth to ALVINN. Cameron tweeted a video shared by some of his students of a car steering itself autonomously using only a camera.

This prompted Pomerleau to ask a few questions about deep learning and neural networks. After some back and forth, Pomerleau brought up ALVINN, which ran at 100 million floating-point operations per second, or about one-tenth the processing power of the Apple Watch. The vehicle’s CPU was the size of a refrigerator and was powered by a 5,000 watt generator, he added. Nonetheless, ALVINN was able to hit 70 mph by the early 1990s.

From The Verge. Read the whole thing.

Localization with Ground-Penetrating Radar

Yesterday, I drove from my in-laws’ house in Sacramento, up into the mountains to go cross-country skiing near Lake Tahoe. It was a lot of fun! The snow was surprisingly good for November.

The drive down was a spectacle, though. Snow was starting to come down and the Interstate highway leading down from the mountains turned into a parking lot. Cars got stuck and stopped in the middle of the highway to put on snow chains. It became impossible to see where the lane lines were.

This got me thinking about what a mess driving in the snow can be, and all the more so for self-driving cars. Sensors like cameras and lidar can become mostly useless in the snow, which completely changes the physical surface of the road.

One solution to this problem is a research project out of MIT: ground-penetrating radar.

This technique uses VHF radar to scan through the surface of the road and down to subterranean features. Supposedly, similar techniques are used by archaeologists.

Obviously, the below-ground composition of the earth doesn’t change with the weather. So if a vehicle can map the ground underneath the road once, then it can come back later and figure out where it is, even in a blinding snowstorm.
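At its core, the matching step is template matching: compare the current subsurface scan against the previously recorded map and pick the offset where they line up best. Here is a toy 1-D sketch of that idea (the real MIT system works on 2-D radar images and is far more robust — the data and similarity measure here are made up for illustration):

```python
# Prior subsurface map: reflectivity values recorded along the road in good weather.
prior_map = [0.1, 0.8, 0.3, 0.5, 0.9, 0.2, 0.7, 0.4, 0.6, 0.1]

# Current scan, measured starting at an unknown position (here, actually offset 3).
scan = [0.5, 0.9, 0.2]

def localize(prior, scan):
    """Slide the scan along the prior map and return the best-matching offset."""
    best_pos, best_score = None, float("-inf")
    for pos in range(len(prior) - len(scan) + 1):
        window = prior[pos:pos + len(scan)]
        # Similarity = negative sum of squared differences (higher is better).
        score = -sum((w - s) ** 2 for w, s in zip(window, scan))
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos

print(localize(prior_map, scan))  # -> 3
```

Because the map below the asphalt doesn’t change with the weather, this comparison keeps working even when the lane lines are buried.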

Pretty cool stuff.

https://www.youtube.com/watch?v=rZq5FMwl8D4