Apple Opens Up (A Little)

Apple has always been one of the more secretive companies in Silicon Valley, especially regarding its self-driving car effort. The company has long refused to confirm that it is working on autonomous vehicles, although there is plenty of circumstantial evidence that it is.

Apple took a couple of steps recently to confirm its work in autonomous vehicles, though. According to Quartz:

Apple, unsurprisingly, is working on a lot of the same problems as other companies that are exploring machine learning: recognizing and processing images, predicting user behavior and events in the physical world, modeling language for use in personal assistants, and trying to understand how to deal with uncertainty when an algorithm can’t make a high-confidence decision.

And this:

Apple has been long-rumored to be building an autonomous vehicle, even sending a letter to the US National Highway Traffic Safety Administration (NHTSA) saying that companies looking to test autonomous vehicles should be treated the same, whether they are new to the field or more established self-driving car shops like Google and Uber.

Blurring the Line Between Level 2 and Level 3

One of the big debates in autonomous vehicle development is whether it’s safe to build a Level 3 autonomous vehicle. This would be a vehicle that could drive itself independently, but could hand control back to the human driver at any moment.

Ford, for example, rather prominently believes that Level 3 vehicles are unsafe.

Tesla, on the other hand, seems to be iteratively developing a Level 3 vehicle.

GM just released its plan, and it reads like something that blurs the line between Level 2 (ADAS) and Level 3:

General Motors’s semi-autonomous “Super Cruise” system will allow drivers to take their hands off the steering wheel for extended periods, but will stop the vehicle automatically if drivers are not attentive, according to a government letter made public on Monday.

The largest U.S. automaker in September 2014 unveiled planned technology to allow drivers on highways to let the vehicle take over driving itself.

But if the road has too many twists and turns or the vehicle detects the driver is not paying attention, it issues a series of alerts. If the human driver does not take over, the vehicle will automatically slow down and then put on the hazard lights.
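The escalation described in the quote is essentially a small state machine: cruise hands-off while the driver is attentive, escalate alerts when attention lapses, and stop the car if the driver never takes over. Here is a minimal sketch of that logic; the ten-second alert window is an invented placeholder, not GM’s actual tuning, and the states are my own simplification.

```python
from enum import Enum, auto

class Mode(Enum):
    CRUISING = auto()   # hands-off driving engaged, driver attentive
    ALERTING = auto()   # driver inattentive: escalating series of alerts
    STOPPING = auto()   # slow the vehicle, turn on hazard lights

def step(mode, driver_attentive, alert_seconds, max_alert_seconds=10):
    """Advance the supervision state machine by one tick (one second)."""
    if mode is Mode.STOPPING:
        # Once the car has decided to stop, stay latched until it does.
        return Mode.STOPPING, alert_seconds
    if driver_attentive:
        # Driver responded: reset and resume hands-off cruising.
        return Mode.CRUISING, 0
    if alert_seconds + 1 >= max_alert_seconds:
        # Alerts exhausted and the driver never took over.
        return Mode.STOPPING, alert_seconds + 1
    return Mode.ALERTING, alert_seconds + 1
```

Even this toy version shows why the Level 2/3 line is blurry: the vehicle is doing the driving, but the design still leans on the human as the fallback.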

I guess that just goes to show that it’s hard to draw bright lines in this space.

ALVINN and Udacity

Deep neural networks actually have quite a long history of powering self-driving cars. Way back in the 1980s, researchers at CMU used a basic two-layer neural network to steer a truck.

We almost covered ALVINN (Autonomous Land Vehicle in a Neural Network) within the Deep Neural Networks module of the Udacity Self-Driving Car Nanodegree Program, but we cut it for time.

Recently, Udacity’s work on neural networks and self-driving cars has reminded people about what a breakthrough ALVINN was.

This proto-driverless vehicle [ALVINN] came up recently in a Twitter discussion between two engineers: Oliver Cameron, who heads an open-source self-driving car project at Udacity, and Dean Pomerleau, a CMU professor who ran the self-driving car project that gave birth to ALVINN. Cameron tweeted a video shared by some of his students of a car steering itself autonomously using only a camera.

This prompted Pomerleau to ask a few questions about deep learning and neural networks. After some back and forth, Pomerleau brought up ALVINN, which ran on hardware capable of 100 million floating-point operations per second, or about one-tenth the processing power of the Apple Watch. The vehicle’s CPU was the size of a refrigerator and was powered by a 5,000-watt generator, he added. Nonetheless, ALVINN was able to hit 70 mph by the early 1990s.

From The Verge. Read the whole thing.
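The core idea behind ALVINN, a small network mapping camera pixels directly to a steering command, fits in a few lines. The sketch below loosely follows published ALVINN descriptions (a 30×32 input retina, one small hidden layer, a row of discrete steering bins), but the layer sizes are approximate and the weights are random placeholders, not trained values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ALVINN-style network: a 30x32 grayscale road image in,
# a distribution over 31 discrete steering directions out.
n_in, n_hidden, n_out = 30 * 32, 4, 31
W1 = rng.normal(scale=0.1, size=(n_hidden, n_in))   # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(n_out, n_hidden))  # hidden -> output weights

def steer(image):
    """Forward pass: road image -> distribution over steering bins."""
    x = image.reshape(-1)            # flatten pixels to a vector
    h = np.tanh(W1 @ x)              # hidden-layer activations
    logits = W2 @ h
    p = np.exp(logits - logits.max())
    return p / p.sum()               # softmax over steering bins

image = rng.random((30, 32))         # stand-in for a camera frame
p = steer(image)
# argmax picks a steering bin: 0 = hard left, 15 = straight, 30 = hard right
direction = int(np.argmax(p))
```

That a forward pass this small could keep a truck in its lane, on 1980s hardware, is what made ALVINN such a breakthrough.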

Localization with Ground-Penetrating Radar

Yesterday, I drove from my in-laws’ house in Sacramento, up into the mountains to go cross-country skiing near Lake Tahoe. It was a lot of fun! The snow was surprisingly good for November.

The drive down was a spectacle, though. Snow was starting to come down and the Interstate highway leading down from the mountains turned into a parking lot. Cars got stuck and stopped in the middle of the highway to put on snow chains. It became impossible to see where the lane lines were.

This got me thinking about what a mess driving in the snow can be, and all the more so for self-driving cars. Snow can render sensors like cameras and lidar mostly useless, and it completely changes the physical surface of the road.

One solution to this problem is a research project out of MIT: ground-penetrating radar.

This technique uses VHF radar to scan through the surface of the road and down to subterranean elements. Supposedly, similar techniques are used by archaeologists.

Obviously, the below-ground composition of the earth doesn’t change with the weather. So if a vehicle can map the ground underneath the road once, then it can come back later and figure out where it is, even in a blinding snowstorm.
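The matching step is conceptually just correlation: slide the current radar scan along the stored subsurface profile and find the offset that lines up best. Here is a toy one-dimensional sketch of that idea; the real MIT system works with much richer radar data, and everything below (profile, noise levels, normalized-correlation scoring) is my own illustrative stand-in.

```python
import numpy as np

def localize(map_profile, scan):
    """Return the offset where `scan` best matches the prior map.

    Slides the scan along the stored subsurface profile and scores
    each position by normalized correlation -- a toy stand-in for
    the matching step in GPR-based localization.
    """
    n = len(scan)
    scan_z = (scan - scan.mean()) / scan.std()
    best_offset, best_score = 0, -np.inf
    for offset in range(len(map_profile) - n + 1):
        window = map_profile[offset:offset + n]
        w = (window - window.mean()) / (window.std() + 1e-12)
        score = float(np.dot(w, scan_z)) / n
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset

rng = np.random.default_rng(1)
prior_map = rng.normal(size=500)          # subsurface profile, mapped on a clear day
true_pos = 212
# A noisy re-scan of the same stretch, as if taken in a snowstorm:
scan = prior_map[true_pos:true_pos + 64] + rng.normal(scale=0.1, size=64)
est = localize(prior_map, scan)
```

Because the subsurface doesn’t change with the weather, the noisy scan still correlates strongly with the right stretch of the map, which is exactly the property that makes this attractive for snow.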

Pretty cool stuff.

https://www.youtube.com/watch?v=rZq5FMwl8D4

Happy Thanksgiving!

Happy Thanksgiving! At least for those of us living in the United States.

I have a lot to be thankful for this year. A new son, a wonderful family, friends, a great job working in a field that I love with a terrific team.

Nothing’s ever perfect (which, for example, is why my wife is giving me dirty looks while I edit Udacity lessons after Thanksgiving dinner), but it’s been a great year for me.

It’s been a pretty good year for the autonomous vehicle industry, too.

  • Production trials with real customers in Pittsburgh and Singapore
  • Tesla Autopilot is only a year old (!)
  • Ford is growing its autonomous vehicle team by multiples
  • Huge acquisitions of Cruise and Otto

Maybe at this time next year I’ll be thankful for my very own self-driving car 🙂

You Have to Say Yes

A couple of people wrote me today to ask for career advice. In both cases, they were really excited about working on self-driving cars and they had been offered jobs in the automotive industry. The jobs involved big pay cuts and were imperfect in other ways.

I tried to offer specific advice to each person, but after I fired off my two cents, I reflected back on some advice I received myself, several years ago.

I went to a talk with Marketplace radio host Kai Ryssdal, only because my wife has a crush on him.

But Kai turns out to have a pretty interesting life story. He flew planes for the US Navy, then worked in China with the US Foreign Service, and then wound up as an unhappy 34-year-old civilian shelving books at Borders while his wife was in grad school.

He had an interest in journalism, but no experience, so he applied for an unpaid internship with a San Francisco radio station.

One thing led to another, and eventually he became a (minor) national radio celebrity.

I tried to find a version of Kai’s talk online, but this is the closest I could locate:

Never say no. If someone says, “Can you come in on Sunday and go to Chinatown to get us some tape for the Monday broadcast,” you have to say yes. And that goes now more than ever in journalism, when it’s so hard to find really good work. If you have an opportunity, you absolutely have to grab it.

This was pretty important career advice for me personally, as it really helped push me into the opportunity I was offered at Ford. And my wife was supportive because, after all, Kai Ryssdal basically told me to take the job.

Another version of Kai’s fascinating life story is here, although he doesn’t drop the “you have to say yes” line:

https://youtu.be/I4U728fR0Nk?t=21m5s

The First Self-Driving Cities

I participated in an interview last week in which Alexy Khrabrov asked me about my vision for the next year in self-driving cars.

My guess is that over the next year we will start to see lots of cities crop up in which it is possible for a normal person to catch a ride in a self-driving car.

As far as I know, this is only possible right now in Singapore, where nuTonomy is running its self-driving taxis, and in Pittsburgh, where Uber is running self-driving cars. (There are other locations with very restricted self-driving vehicles, like autonomous buses running on short, fixed routes.)

Today nuTonomy, an MIT spinout, signed a deal with its hometown city of Boston to start a self-driving car program there.

So that brings the number of cities to three. It’s not hard to imagine similar programs in San Francisco and Mountain View (Google), Detroit (Ford, GM, Delphi), Stuttgart (Mercedes), Munich (BMW), and London (Delphi).

Hopefully imagination will become reality in 2017.

Autonomous Shipping Containers

An interesting angle on autonomous vehicles that was recently pointed out to me is the rise of vehicles with no passenger whatsoever.

This seemed obvious as soon as somebody spelled it out, but I had never really dwelt on the ramifications.

Commercial transportation often has two components: a cab and a trailer. The purpose of the cab is to provide power and (human) control, while the trailer contains the load.

With autonomous vehicles, human control is no longer necessary and I can imagine removing most of the cab. Basically what we wind up with is autonomous shipping containers.

Imagine a long stretch of rural highway where most traffic consists of self-driving shipping containers with no humans in sight. It’s kind of a wild vision.