Self-Driving Cars and Kangaroos and Localization

One of the poorly understood complications in autonomous vehicles is how much work will be involved in transferring self-driving technology from one location to another.

Driving in the United States is much different from driving in India, and in fact driving in San Francisco is different from driving in Boston or Peoria. But it’s hard to get a handle on just how big a challenge this will be until we try to transfer self-driving technology to different areas.

Volvo is running (bouncing?) into some problems with their self-driving technology in Australia, according to this delightful article from The Australian Broadcasting Corporation.

Apparently the object detection software for animals is thrown off by the way kangaroos hop, which isn’t characteristic of the moose that Volvo typically encounters in Sweden.

“We’ve noticed with the kangaroo being in mid-flight … when it’s in the air it actually looks like it’s further away, then it lands and it looks closer,” Volvo Australia’s technical manager David Pickett said.

Not to worry, though. According to The Guardian:

“We are developing a car that can recognise kangaroos,” he said.

Autonomous Denver

Thanks so much to Nate and Aaron and Myles and the entire Autonomous Denver Meetup for hosting me on Tuesday night. And thank you to Uber, for letting us meet in their Louisville, Colorado, office!

It’s always delightful to meet current and potential students and people who are interested in self-driving cars all around the world. This one was special because I love Colorado so much.

Nate arranged for a recording of my presentation about Carla, the Udacity self-driving car, which was awesome. Unfortunately, there were some AV glitches, so the recording comes in two parts and a small portion in the middle was lost.

It was a fun conversation, though, and I hope to be back soon.

https://www.youtube.com/watch?v=DsA5ICRYBp8

https://www.youtube.com/watch?v=appixzC6RO0

Autonomous Racing

The Udacity Self-Driving Car team celebrated a company award with a team outing to the NASCAR race at Sonoma Raceway yesterday.

It was our first NASCAR race and it was an experience.

Kevin Harvick won, but I couldn’t help thinking of a future when autonomous cars can outrace humans. We’re not there yet, but it’s coming.

Here’s the NIO EP9, arguably the world’s fastest autonomous race car:

Here’s the Formula E autonomous series, Roborace:

And here’s a montage of the Self-Racing Cars homebrew event, with a cameo from my now-colleague, Anthony Navarro:

A Couple of Good Localization Projects

Localization is how a car finds where it is in the world.

It’s tempting to think that localization is an easy problem, since GPS-enabled smartphones are ubiquitous, but localization is actually really hard.

That’s because GPS is generally only accurate to within a meter or more. A meter is about three feet, so if a car is off by a meter, it could be driving on the sidewalk and hitting things.

Self-driving cars need single-digit centimeter-level localization accuracy. To that end, we use sensor measurements, maps, and sophisticated mathematical algorithms to localize the vehicle.
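To make the idea concrete, here’s a toy one-dimensional particle filter, the same family of algorithm taught in the localization lessons. Everything here is made up for illustration: the landmark map, the noise levels, and the helper names (`sense`, `localize`) are all assumptions, not Udacity’s actual implementation.

```python
import math
import random

# Hypothetical 1D map: landmark positions in meters (made up for illustration).
LANDMARKS = [20.0, 80.0, 140.0]

def sense(position, noise=0.0):
    """Distance from `position` to each landmark, optionally noisy."""
    return [abs(position - lm) + random.gauss(0.0, noise) for lm in LANDMARKS]

def gaussian(x, mu, sigma):
    """1D Gaussian probability density."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def localize(true_position, n_particles=1000, steps=15, sensor_sigma=1.0):
    """Estimate position from noisy landmark distances via a particle filter."""
    particles = [random.uniform(0.0, 160.0) for _ in range(n_particles)]
    for _ in range(steps):
        z = sense(true_position, noise=0.2)  # noisy measurement from the car
        # Weight each particle by how well its predicted distances match z.
        weights = [
            math.prod(gaussian(d, zd, sensor_sigma) for d, zd in zip(sense(p), z))
            for p in particles
        ]
        if sum(weights) > 0:
            # Resample in proportion to weight, then jitter to keep diversity.
            particles = random.choices(particles, weights=weights, k=n_particles)
        particles = [p + random.gauss(0.0, 0.5) for p in particles]
    return sum(particles) / n_particles
```

Even this toy version converges to within centimeters on clean data; the hard parts in a real car are the map itself, data association, and doing all of this in real time.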

Here are some localization projects that Udacity Self-Driving Car students have published!

Tracking a self-driving car with high precision

Priya Dwivedi

Priya has a terrific description of her localization project, along with a video. Plus, she asks lots of great questions about how to localize in complicated scenarios, which is what we have to deal with in the real world.

“Here is an interesting question? — How would we use this technique for a real self driving car traveling between City A and City B? Particle filters assumes we have a map of the world with known location of many landmarks. How can we determine location of hundreds of landmarks and feed those to the car?”

Self-Driving Car Engineer Diary — 9

Andrew Wilkie

If you’re looking for a step-by-step walkthrough of the particle filter algorithm, Andrew combines a review of material from the Udacity localization lesson with his own observations.

“Update Weights : These measurements form the weight of each particle by applying the multi-variate gaussian probability density function. This function tells us how likely a set of landmark measurements is given, our predicted state of our car.”
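The weight update Andrew describes can be sketched in a few lines. This is an illustrative version with a diagonal covariance and made-up sensor sigmas, not the project’s actual code.

```python
import math

def multiv_gauss(x, y, mu_x, mu_y, sig_x, sig_y):
    """2D Gaussian density with a diagonal covariance, the function the
    weight update applies to each transformed landmark observation."""
    norm = 1.0 / (2.0 * math.pi * sig_x * sig_y)
    exponent = ((x - mu_x) ** 2) / (2 * sig_x ** 2) + ((y - mu_y) ** 2) / (2 * sig_y ** 2)
    return norm * math.exp(-exponent)

def particle_weight(observations, landmarks, sig_x=0.3, sig_y=0.3):
    """Particle weight: product of densities over all observations.
    `observations` are map-frame (x, y) measurements as seen by this particle;
    `landmarks` are the (x, y) map positions each observation was matched to."""
    weight = 1.0
    for (ox, oy), (lx, ly) in zip(observations, landmarks):
        weight *= multiv_gauss(ox, oy, lx, ly, sig_x, sig_y)
    return weight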

Bosch Automated Mobility Academy

Bosch, the world’s largest automotive supplier (and also a Udacity Self-Driving Car partner), has published a neat Automated Mobility Academy that describes different levels of autonomous driving and key features at each level.

It’s basically a web series, free and targeted at the general public, that describes different parts of the autonomous vehicle ecosystem.

The stages are: Driver Assistance, Partially Automated Driving, Conditional and Highly Automated Driving, and Fully Automated Driving. These pages correspond roughly to Levels 2 through 5 of the SAE autonomy levels.

This is a much lighter treatment of autonomous vehicles than the Udacity Self-Driving Car Engineer Nanodegree Program, but it’s short and concise and accessible to the general public.

If you’re interested in learning a little bit about how self-driving cars work, you should check it out!

Upcoming Autonomous Denver Meetup

The David Silver — Udacity world tour continues next Tuesday, in Denver, Colorado! Come by the Autonomous Denver Meetup, graciously hosted at Uber’s Louisville, Colorado, office.

I’ll be presenting an overview of Carla, the Udacity Self-Driving Car that can drive itself from Mountain View to San Francisco.

Carla herself will not be present, sadly, but there will be lots of good videos and sensor data.

There will also be an all-questions-welcome Q&A about Udacity, the Nanodegree Program, self-driving cars, and more. Please come!

See you in the Mile High City 🙂

DIY Robocar Meetup

This weekend I went to the DIY Robocar Meetup in Oakland, which is an awesome event if you’re excited about autonomous vehicles.

Lots of people gather in a warehouse for a day of hacking on miniature autonomous vehicles, and the day is capped off by a time trial.

I went to watch and to get my son out of the house so my wife could have the day to herself, but it was fun to meet lots of Udacity students who were participating.

All of them attested to how much they were learning by putting their skills to use on actual embedded hardware that had to run in realtime.

There are several of these Meetups springing up around the world, and there are even a number of kits you can buy to get up and running quickly. So if you are interested in the field, consider trying it out! And if there’s not a similar event near you, maybe you can start one 🙂

Here’s the second-place car for the weekend:

Amazon and Whole Foods and Self-Driving Cars

News broke today that Amazon will be acquiring Whole Foods for $13.4 billion.

There are a lot of interesting angles here, but of course I’m particularly interested in the autonomous vehicle angle.

Amazon has generally been wildly successful (understatement?), but it has had a hard time cracking the grocery market.

“…the e-commerce giant has long wanted to figure out the online groceries game. It started testing delivery concepts in August 2007, when it unveiled Amazon Fresh — delivering produce and pantry staples through its fulfillment centers. Yet even after a decade — eons in Silicon Valley time — it’s still trying. Turns out, the instant gratification business doesn’t quite work with fresh food.”

Forbes goes on to talk about the difficulties inherent to grocery retailing:

“A lot of the stuff you buy in a grocery store spoils easily, which means you have to get them home quickly — plus, someone has to be there to receive the goods. That can be tricky, given how Amazon likes to optimize delivery routes and bundle items to maximize efficiency.”

This is precisely where self-driving cars come in. Optimizing delivery routes and bundling become drastically less important when the marginal cost of a delivery plummets by 50% or more.

AmazonFresh Pickup and Amazon Go are new spins on the grocery store, but neither of them seems particularly disruptive — more like a new spin on an old model. And the cognitive load of switching to a new commerce model might not be worth the relatively small benefit.

A deliver-groceries-to-me-right-now service, though, seems like it could make a lot of people think twice about piling into the car to go to Safeway.

3 Approaches to Vehicle Detection and Tracking

Three Udacity students each took different approaches to vehicle detection and tracking — some using deep learning and others using standard computer vision. Here’s what they learned!

Vehicle Detection and Tracking

Ivan Kazakov

Ivan has a terrific writeup of how to use deep learning for vehicle detection. He builds a model based on Faster-RCNN, but smaller and faster.

“The main idea is that since there is a binary classification problem (vehicle/non-vehicle), a model could be constructed in such a way that it would have an input size of a small training sample (e.g., 64×64) and a single-feature convolutional layer of 1×1 at the top, which output could be used as a probability value for classification.”
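One way to see why Ivan’s fully-convolutional trick works is to track how the output size changes with the input. The layer stack below is hypothetical (not Ivan’s actual architecture), but it shows how a network that collapses a 64×64 training crop to a single 1×1 score turns a wider strip into a whole row of detection scores in one forward pass.

```python
def conv_out(size, kernel, stride):
    """Output size of a valid (no-padding) convolution or pooling layer."""
    return (size - kernel) // stride + 1

def detection_map_size(h, w, layers):
    """Spatial size of the final feature map for an h x w input."""
    for kernel, stride in layers:
        h, w = conv_out(h, kernel, stride), conv_out(w, kernel, stride)
    return h, w

# A hypothetical stack of conv (stride 1) and pooling (stride 2) layers
# whose receptive field collapses a 64x64 training crop to a single score.
LAYERS = [(3, 1), (2, 2), (3, 1), (2, 2), (3, 1), (2, 2), (6, 1)]
```

With this stack, a 64×64 input yields a 1×1 output (one probability), while a 64×192 strip of road yields a 1×17 map, effectively sliding the classifier across the strip without re-running it per window.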

Udacity Self Driving Car Engineering Project 5 — Vehicle Detection

Martijn de Boer

https://www.youtube.com/watch?v=7h1iv-9sqys

Martijn uses a HOG-and-SVM approach to build a vehicle detection pipeline. He encounters some issues with noise and finds a creative solution.

“I was advised do try Hard Negative Mining to train my model more accurate, so I captured multiple images of the shadows / threes and added them to the non car image dataset. (to classify them among the non-car classes instead of the car classes)”
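For readers curious what the HOG half of such a pipeline looks like, here’s a deliberately simplified, hand-rolled sketch of HOG feature extraction. Real projects typically use `skimage.feature.hog` and feed the features to a classifier like `sklearn.svm.LinearSVC`; the cell size and bin count here are just common defaults, not Martijn’s settings.

```python
import numpy as np

def hog_features(img, cell=8, bins=9):
    """Deliberately simplified HOG: per-cell histograms of gradient
    orientation, weighted by gradient magnitude. Real implementations
    (e.g. skimage.feature.hog) also normalize over blocks of cells."""
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]   # horizontal gradient
    gy[1:-1, :] = img[2:, :] - img[:-2, :]   # vertical gradient
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0  # unsigned orientation
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            m = mag[i:i + cell, j:j + cell].ravel()
            a = ang[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            feats.append(hist)
    # The concatenated histograms form the feature vector that would be
    # fed to a classifier such as sklearn.svm.LinearSVC.
    return np.concatenate(feats)
```

A 64×64 grayscale crop produces an 8×8 grid of cells, each a 9-bin histogram, so a 576-dimensional feature vector. Shadows and trees produce strong gradients too, which is exactly why Martijn needed hard negative mining.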

Automatic Vehicle Detection for Self Driving Cars

Priya Dwivedi

Priya uses a HOG and SVM approach to vehicle detection. By combining those features with a heatmap threshold applied over time, she achieves great performance. She also discusses some of the tradeoffs.

“Firstly, I am not sure this model would perform well when it is a heavy traffic situations when there are multiple vehicles. You need something with near perfect accuracy to avoid bumping into other cars or to ensure there are no crashes on a crossing. More importantly, the model was slow to run. It took 6–7 minutes to process 1 minute of video. I am not sure this model would work in a real life situation with cars and pedestrians on the road.”
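The heat-over-time idea Priya uses can be sketched roughly like this. The class name, window length, and threshold are made-up values for illustration, not hers.

```python
from collections import deque
import numpy as np

class HeatmapTracker:
    """Accumulate detection 'heat' over the last few frames and keep only
    pixels that stay hot, suppressing one-frame false positives."""

    def __init__(self, shape, history=5, threshold=3):
        self.shape = shape
        self.threshold = threshold
        self.frames = deque(maxlen=history)

    def update(self, boxes):
        """boxes: list of ((x1, y1), (x2, y2)) detections for this frame.
        Returns a boolean mask of pixels hot enough to trust."""
        heat = np.zeros(self.shape)
        for (x1, y1), (x2, y2) in boxes:
            heat[y1:y2, x1:x2] += 1.0
        self.frames.append(heat)
        return sum(self.frames) >= self.threshold
```

A real car persists in roughly the same place across frames and survives the threshold; a shadow that fools the SVM for a single frame does not.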