Autonomous Vehicle Roundup

Google Spent $1.1 Billion on Waymo So Far: Compared to the prices paid for Mobileye, Cruise, Otto, and Argo AI, this seems like a steal.

Ford Faked a Self-Driving Car to Test out Human-Machine Interaction: It’s not clear to me that they really needed to fake the absence of a driver.

New Robotics Grads in Pittsburgh are Making $200,000 per Year: Carnegie Mellon University is almost single-handedly responsible for this. The returns to a city from hosting a top-flight research university are looking pretty good.

Facebook’s Automotive Strategy

“I come with very good news. We’re the only company in Silicon Valley that’s not building a car.”

I wasn’t there, but apparently Sheryl Sandberg got some good laughs with that line at IAA New Mobility World yesterday.

So why is Facebook sponsoring one of the key events at the world’s largest automotive show?

Facebook does have an automotive strategy, and it seems to revolve around virtual reality and augmented reality.

The idea is that Facebook’s Oculus headsets could be used everywhere from the factory floor — where they would help autoworkers assemble vehicles more quickly — to in the car, where self-driving technology would free passengers to enjoy in-car virtual reality experiences.

So far it all seems to make sense, but it also seems a little vague. I’ll be excited when we see a specific product or service related to self-driving cars that Facebook wants to launch.

Udacity at IAA

Udacity will be at the International Motor Show (IAA) in Frankfurt, Germany, this week!

Wednesday

I’ll be flying over on Wednesday as part of Lufthansa’s FlyingLab, which is a little bit like South by Southwest in the sky.

Friday

My main event in Frankfurt will be at the me Convention, a conference put on by Mercedes-Benz in conjunction with SXSW. On Friday afternoon I’ll be speaking on a panel entitled “Teaching Machines to Drive Like Humans,” with Sarah Marie Thornton from Stanford and Danny Shapiro from NVIDIA. Come say hello!

Late Friday afternoon, Udacity will be at the Speaker’s Corner at the IAA New Mobility World. Meet our European team and ask me as many questions as you like about Udacity’s Self-Driving Car Engineer Nanodegree Program.

Saturday

Saturday afternoon we are excited to be holding a career workshop for Udacity students. Students will have the opportunity to take professional headshots, get career coaching from Udacity experts, and hear from me about hiring in the autonomous vehicle industry. This event is full, but drop a note here if it’s something you’d be interested in attending in the future.

Saturday evening I’ll be hosting a dinner for Udacity Self-Driving Car students in Frankfurt. This event is full, too, but I am excited to meet everyone and we are working hard to squeeze people in. So if you are an enrolled student and would like to attend, email me at david.silver@udacity.com and we’ll find a space.

Sunday

I have to leave Germany on Sunday, sadly 😦

But the reason I’m rushing home is to get to TechCrunch Disrupt in San Francisco. Udacity has some big announcements coming out next week, so keep your ear to the ground for that!


If you’d like to say hello in Frankfurt, but won’t be able to make any of the events I mentioned here, email me at david.silver@udacity.com. I will have a little bit of down time between events and I’m excited to meet people on my first trip to Germany!

Chris Urmson on Recode Decode

On the latest episode of the Recode Decode podcast, Kara Swisher interviews Chris Urmson.

Urmson’s Carnegie Mellon team won the DARPA Urban Challenge in 2007. From there he joined my current boss, Sebastian Thrun, at the Google Self-Driving Car Project, back when Sebastian was running that. Urmson led the Google Self-Driving Car Project himself after Sebastian departed to launch Udacity. Most recently, Urmson left Google to found Aurora Innovation, his own self-driving car startup.

The interview is an hour long and Swisher lives up to her reputation as the best of tech journalists. They cover the early days of the DARPA challenges, how Urmson got to Google, why he left Google, what Aurora is doing, how the automotive industry and tech industry will partner and compete, when we will see fully autonomous vehicles on the road, and the moral responsibility autonomous vehicle engineers have to the drivers they might put out of work.

Listen to the whole thing.

https://art19.com/shows/recode-decode/episodes/e6ae4866-7f89-4674-810b-63fa991b7dce

Lyft and Drive.ai

Lyft and Drive.ai will be offering free rides in self-driving cars to riders in the Bay Area who opt in. Soon, supposedly.

Lyft continues to build an impressive list of partnerships: GM, Jaguar Land Rover, nuTonomy.

The Drive.ai partnership sounds especially promising, since the announcement is written in a tone suggesting this effort will start soon, maybe before the end of the year.

There are self-driving cars being tested on public streets in cities around the world: Pittsburgh, Phoenix, San Francisco, Singapore, Austin. Some of those cities even have self-driving cars testing with a limited set of pre-screened passengers.

But right now, as far as I know, there is only one city where any civilian off the street can show up and hail a self-driving vehicle. That would be Pittsburgh, where Uber has opened its fleet to public use (with safety drivers).

Drive.ai testing in the Bay Area would bring that number to two.

The SELF DRIVE Act

The Safely Ensuring Lives Future Deployment and Research In Vehicle Evolution (SELF DRIVE) Act just passed the House of Representatives unanimously. Congress loves a good backronym.

Wired has a nice rundown of the key parts of the bill. The major feature seems to be moving autonomous vehicle regulation from a patchwork of 50 different state regimes over to a single set of federal regulations. You’ll no longer have to worry about getting arrested when you take your self-driving car across state lines.

The law was pushed by a consortium of interested companies — Ford, Waymo, Lyft, Uber, and Volvo. To be honest, on self-driving cars, I trust my old employers at Ford more than I trust any department of motor vehicles.

The “unanimous” passage of the bill makes me a little nervous — as if not enough people were paying attention. Surely there is something in there somebody objects to. Personally, I kind of like the “laboratories of democracy” aspect of state regulation.

But since it seems destined to become law anyway, we should all hope this bill speeds the development of safe autonomous vehicles that prevent many of the 35,000 annual motor vehicle fatalities in the US.

How the Udacity Self-Driving Car Works

At Udacity, where I work, we have a self-driving car. Her name is Carla.

Carla’s technology is divided into four subsystems: sensors, perception, planning, and control.

Sensors

Carla’s sensor subsystem encompasses the physical hardware that gathers data about the environment.

For example, Carla has cameras mounted behind the top of the windshield. There are usually between one and three cameras lined up in a row, although we can add or remove cameras as our needs change.

Carla also has a single front-facing radar, embedded in the bumper, and one 360-degree lidar, mounted on the roof.

Lidar data takes the form of a point cloud: a set of 3D points reflected back from surfaces in the environment.

Sometimes Carla will utilize other sensors, too, like GPS, IMU, and ultrasonic sensors.

Data from these sensors flows into various components of the perception subsystem.

Perception

Carla’s perception subsystem translates raw sensor data into meaningful intelligence about her environment. The components of the perception subsystem can be grouped into two blocks: detection and localization.

The detection block uses sensor information to detect objects outside the vehicle. These detection components include traffic light detection and classification, object detection and tracking, and free space detection.

The localization block determines where the vehicle is in the world. This is harder than it sounds. GPS can help, but GPS is only accurate to within 1–2 meters, and for a car that error range is unacceptably large: a car that thinks it’s in the center of a lane could really be on the sidewalk, running into things. We need to do a lot better than the accuracy GPS alone provides.

Fortunately, we can localize Carla to within 10 centimeters or less, using a combination of high-definition maps, Carla’s own lidar sensor, and sophisticated mathematical algorithms. Carla’s lidar scans the environment, compares what it sees to a high-definition map, and then determines a precise location.
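The map-matching idea can be sketched in a few lines. This is a deliberately simplified, hypothetical illustration (not Carla’s actual algorithm): we score each candidate pose by how many lidar points, transformed into map coordinates, land on cells the high-definition map marks as occupied, and keep the best-scoring pose.

```python
import math

def transform(points, x, y, theta):
    """Transform lidar points from the vehicle frame into the map frame at pose (x, y, theta)."""
    c, s = math.cos(theta), math.sin(theta)
    return [(x + c * px - s * py, y + s * px + c * py) for px, py in points]

def score_pose(points, occupied, x, y, theta, cell=0.1):
    """Count how many transformed lidar points fall on occupied 10 cm map cells."""
    hits = 0
    for mx, my in transform(points, x, y, theta):
        if (round(mx / cell), round(my / cell)) in occupied:
            hits += 1
    return hits

def localize(points, occupied, candidates):
    """Return the candidate pose whose lidar scan best matches the map."""
    return max(candidates, key=lambda p: score_pose(points, occupied, *p))
```

Real systems refine this basic idea with particle filters or scan-matching over continuous poses, but the core loop is the same: propose poses, compare the scan to the map, keep the best match.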

Carla localizes herself by figuring out where she is on a high-definition map.

The components of the perception subsystem route their output to the planning subsystem.

Planning

Carla has a straightforward planning subsystem. The planner builds a series of waypoints for Carla to follow. These waypoints are just spots on the road that Carla needs to drive over.

Each waypoint has a specific location and associated target velocity that Carla should match when she passes through that waypoint.
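A waypoint is a small data structure. Here is a minimal sketch of what one might look like; the field names are illustrative, not Carla’s actual message format.

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float                # position along the map's east axis, meters
    y: float                # position along the map's north axis, meters
    target_velocity: float  # desired speed when passing this point, m/s

# e.g. a short straight segment, one waypoint per meter, driven at a steady 10 m/s
path = [Waypoint(float(i), 0.0, 10.0) for i in range(5)]
```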

Carla’s planner uses the perception data to predict the movements of other vehicles on the road and update the waypoints accordingly.

For example, if the planning subsystem were to predict that the vehicle in front of Carla would be slowing down, then Carla’s own planner would likely decide to decelerate.

The final step in the planning process would be for the trajectory generation component to build new waypoints that have slower target velocities, since in this example Carla would be slowing down as she passes through the upcoming waypoints.
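That re-generation step can be sketched as a simple velocity taper. This is an assumed, simplified scheme (the function name and linear blend are mine, not Carla’s): each upcoming waypoint’s target velocity is blended progressively toward the predicted speed of the lead vehicle.

```python
def decelerate(velocities, lead_speed):
    """Ramp waypoint target velocities linearly down to the lead vehicle's speed."""
    n = len(velocities)
    out = []
    for i, v in enumerate(velocities):
        frac = (i + 1) / n                       # 0 near the car, 1 at the last waypoint
        out.append(v + frac * (lead_speed - v))  # blend toward lead_speed
    return out
```

A production planner would also respect comfort limits on acceleration and jerk, but the shape of the output is the same: waypoints whose target velocities decrease along the trajectory.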

Similar calculations affect how the planning subsystem treats traffic lights and traffic signs.

Once the planner has generated a trajectory of new waypoints, this trajectory is passed to the final subsystem, the control subsystem.

Control

The control subsystem actuates the vehicle by sending acceleration, brake, and steering messages. Some of these messages are purely electronic, and others have a physical manifestation. For example, if you ride in Carla, you will actually see the steering wheel turn itself.

The control subsystem takes as input the list of waypoints and target velocities generated by the planning subsystem. Then the control subsystem passes these waypoints and velocities to an algorithm, which calculates just how much to steer, accelerate, or brake, in order to hit the target trajectory.

There are many different algorithms that the control subsystem can use to map waypoints to steering and throttle commands. These different algorithms are called, appropriately enough, controllers.

Carla uses a fairly simple proportional-integral-derivative (PID) controller, but more sophisticated controllers are possible.
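A PID controller computes its command from three terms: one proportional to the current error, one to the accumulated (integral) error, and one to the error’s rate of change. Here is a minimal sketch; the gains shown are illustrative, not Carla’s actual tuning.

```python
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        """Return a control command (e.g. throttle) for the current error."""
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# e.g. throttle control toward a target speed 2 m/s above the current speed
pid = PID(kp=0.5, ki=0.1, kd=0.05)
throttle = pid.step(error=2.0, dt=0.1)
```

The appeal of PID is that it requires no model of the vehicle, only tuned gains; more sophisticated controllers, like model predictive control, trade that simplicity for better anticipation of the trajectory ahead.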

A Self-Driving Car

That’s how Carla works!

First, the sensor subsystem collects data from Carla’s cameras, radar, and lidar. The perception subsystem uses that sensor data to detect objects in the world and localize the vehicle within its environment. Next, the planning subsystem uses that environmental data to build a trajectory for Carla to follow. Finally, the control system turns the steering wheel and fires the accelerator and brake in order to move the car along that trajectory.

We’re very proud of Carla. She’s driven from Mountain View to San Francisco, and done lots of driving on our test track.

The most exciting thing about Carla is that every Udacity student gets to load their code onto her computer at the end of the Nanodegree Program and see how well she drives on our test track.

If you’re interested in learning about how to program self-driving cars, and you want to try out driving Carla yourself, you should sign up for the Udacity Self-Driving Car Engineer Nanodegree Program!

California DMV Autonomous Vehicle List

There are now 39 companies on the list of authorized testers of autonomous vehicles in California.

The list, which I believe is in chronological order, starts with big automotive companies that everybody recognizes, and then includes many smaller, lesser-known startups toward the end. Udacity is pretty much in the middle 🙂

An interesting exercise would be to dive into this list and figure out where each company is with their self-driving car development efforts.

  • Volkswagen Group of America
  • Mercedes Benz
  • Waymo
  • Delphi Automotive
  • Tesla Motors
  • Bosch
  • Nissan
  • GM Cruise LLC
  • BMW
  • Honda
  • Ford
  • Zoox, Inc.
  • Drive.ai, Inc.
  • Faraday & Future Inc.
  • Baidu USA LLC
  • Wheego Electric Cars Inc.
  • Valeo North America, Inc.
  • NextEV USA, Inc.
  • Telenav, Inc.
  • NVIDIA Corporation
  • AutoX Technologies Inc
  • Subaru
  • Udacity, Inc
  • Navya Inc.
  • Renovo.auto
  • UATC LLC (Uber)
  • PlusAi Inc
  • Nuro, Inc
  • CarOne LLC
  • Apple Inc.
  • Bauer’s Intelligent Transportation
  • Pony.AI
  • TuSimple
  • Jingchi Corp
  • SAIC Innovation Center, LLC
  • Almotive Inc
  • Aurora Innovation
  • Nullmax
  • Samsung Electronics

Happy Birthday to Me


Today is my 36th birthday, which officially pushes me into middle age on those surveys that ask if you are 25 or younger, 26–35, 36–45, and so on. That’s a little rough, but otherwise this has been a great year, professionally and personally.

12 months ago, on my 35th birthday, I was leading a small team that was hustling to put together enough of a self-driving car curriculum to release before the holidays.

12 months later, today, our small team has grown and turned over, and we’ve released a nine-month program that has helped thousands of students prepare for roles working on autonomous vehicles. Our first students will graduate in the coming months, and we’ve already seen many students land jobs in the field, even before graduation.

I hope in another twelve months I can circle back and say that this coming year has been even better.