Teleoperation

Nissan is considering building out its self-driving cars in conjunction with call centers staffed by representatives who can help the self-driving cars get out of sticky situations.

According to Wired, Nissan calls this system “teleoperation” and views it as unavoidable, at least in the short term. Weird things happen on the road and the car won’t be able to figure it all out on its own. If the car also doesn’t have a steering wheel, that leaves tele-drivers as the next best option.

On the one hand, this makes perfect sense. Wired draws an analogy to an elevator, which almost always has an emergency call button that presumably (I’ve never tried) dials somebody who can help.

The rub seems to come down to whether these call centers are staffed by representatives who only step in at critical junctures, or whether the vehicles are really “teleoperated”.

The latter doesn’t seem safe (latency being a big problem) and doesn’t seem like a big improvement over normal human driving, but it’s an interesting minimum viable product.

Frank Chen’s 16 Questions About Self-Driving Cars

Frank Chen is a Partner at Andreessen Horowitz who publishes terrific summaries of exciting areas of technology.

A while back we talked on the phone about a self-driving car presentation he was thinking of putting together.

He put it together and it’s a terrific discussion of 16 Questions About Self-Driving Cars.

  1. Straight to Level 5 or not?
  2. LIDAR or not?
  3. Pre-computed HD maps, or build on the fly?
  4. What blend of computation techniques?
  5. How much real world vs. virtual world testing?
  6. Will V2X radios play an important role?
  7. Can we get rid of traffic lights and four way stops?
  8. How will automakers “localize” their cars?
  9. How will accident rates trend?
  10. When will it become illegal to drive?
  11. How will insurance change?
  12. Who will win? Silicon Valley vs. China vs. Incumbents
  13. Will we buy cars or transportation as a service?
  14. How will commute times change?
  15. How will cities change?
  16. When will this start, and then how quickly will we change to autonomous cars?

Watch the whole presentation!

Mercedes Drive Pilot

The Verge took Mercedes Drive Pilot on the road and says it looks amazing, “better than Elon’s”.

Drive Pilot is to the steering wheel what adaptive cruise is to stop and go pedals. Like Tesla’s Autopilot, the Mercedes system allows the driver to hand over direct control of steering and speed, while still supervising the overall operation of the car. Think of the driver as a manager in charge of employees: they’re controlling overall direction, but not micromanaging each individual operation.

And:

The system adapts to how much steering force is used, which allows the driver to decide exactly how much input to give. Use a light touch and the steering assist does most of the work. Apply a firmer hand and the system seamlessly gives up control. With Tesla’s Autopilot, applying steering force results in a slightly alarming jerk of the wheel when the system disengages. Mercedes engineers told me they wanted anyone to be able to take control of the car without any difficulty, noting more than once that the driver was always in charge, no matter how much work the car was doing on their behalf.

This is particularly exciting for me because Mercedes-Benz has been a huge supporter of the Udacity Self-Driving Car Engineer Nanodegree Program. In fact, Mercedes-Benz engineers are personally designing and teaching large parts of our upcoming Sensor Fusion and Localization modules.

Sign up to join us if you want to learn from the best!

Global Artificial Intelligence Conference

On Friday, January 20th, I will be speaking at the Global Artificial Intelligence Conference in Santa Clara on “How to Become a Self-Driving Car Engineer”.

The talk will be similar to one I gave for the Bay Area AI Meetup, but with a new live-coding exercise!

Come out to say hello and learn how to become a self-driving car engineer with Udacity’s Self-Driving Car Engineer Nanodegree Program.

First News Out of CES

Our partner HARMAN has a new concept out for how to handle human-vehicle interaction in a Level 3 autonomous vehicle!

Full Windshield Heads-Up Display: Oasis utilizes the full windshield to project navigation prompts and other information to the driver, while also simultaneously projecting entertainment or information to the passenger.

Autonomous Drive Readiness Check — Handover to Manual: One of the most critical concerns of autonomous vehicles is how to ensure the transition between autonomous mode and manual mode is handled seamlessly. HARMAN’s solution combines haptic feedback, eye gaze tracking, and the driver’s cognitive load readiness through pupil monitoring, to ensure that the driver is truly engaged and able to safely take control of the steering wheel.

Augmented Reality Concierge: This solution addresses the need to support increased productivity in the car while minimizing distraction. A voice-controlled virtual assistant functions as a concierge, automatically suggesting and displaying personalized points of interest while enabling advanced in-vehicle productivity to join conference calls, update calendars and more. Through Skype connectivity, the system can even translate telephone conversations — in real time — with colleagues speaking different languages.

Predictive Collision Prevention: V2X (vehicle-to-vehicle and vehicle-to-infrastructure) technologies detect objects on a collision course and offer corrective action.

Intelligent E-Mirrors: Mirrors that are automatically activated/dimmed based on user gaze.

Image Augmentation

Data is the key to deep learning, and machine learning generally.

In fact, Stanford professor and machine learning guru (and Coursera founder, and Baidu Chief Scientist, and…) Andrew Ng says that it’s not the engineer with the best machine learning model that wins, rather it’s whoever has the most data.

One way to get a lot of data is to painstakingly collect a lot of it. All else equal, this is the best way to compile a huge machine learning dataset.

But all else is rarely equal, and compiling a big dataset is often prohibitively expensive.

Enter data augmentation.

The idea behind data augmentation (or image augmentation, when the data consists of images) is that an engineer can start with a relatively small dataset, make lots of copies, and then perform interesting transformations on those copies. The end result is a much larger dataset.

One of the Udacity Self-Driving Car Engineer Nanodegree Program students, Vivek Yadav, has a terrific tutorial on how he used image augmentation to train his network for the Behavioral Cloning Project.

1. Augmentation:
 A. Brightness Augmentation
 B. Perspective Augmentation
 C. Horizontal and Vertical Augmentation
 D. Shadow Augmentation
 E. Flipping

2. Preprocessing

3. Sub-sampling
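Vivek’s post walks through the full pipeline; as a minimal sketch of two of the steps above (brightness augmentation and flipping), assuming images are NumPy arrays, it might look something like this. The value ranges and probabilities here are illustrative, not taken from his tutorial:

```python
import numpy as np

def augment(image, rng):
    """Return a randomly brightened, possibly mirrored copy of an HxWxC image."""
    out = image.astype(np.float32)
    # Brightness augmentation: scale all pixel values by a random factor
    out *= rng.uniform(0.5, 1.5)
    out = np.clip(out, 0, 255)
    # Flipping: mirror horizontally half of the time
    # (for behavioral cloning, the steering label would be negated too)
    if rng.random() < 0.5:
        out = out[:, ::-1, :]
    return out.astype(np.uint8)

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(66, 200, 3), dtype=np.uint8)  # dummy camera frame
augmented = [augment(image, rng) for _ in range(10)]  # ten new samples from one image
```

Each call produces a slightly different training example from the same source frame, which is the whole point: many cheap variations instead of many expensive collection runs.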

Read the whole thing!

Self-Driving Car Predictions for 2017


On the first day of 2017, here’s what I think the year to come has in store for self-driving cars.

In the style of Scott Alexander, I attach confidence levels to my predictions.

  1. 1 Udacity Self-Driving Car Engineer Nanodegree Program student will start a new, permanent, full-time autonomous vehicle job: 99%
  2. Level 4 self-driving cars will be available for ridesharing on public roads somewhere in the world: 90%
  3. Level 4 self-driving cars will be available for ridesharing on public roads somewhere in the United States: 90%
  4. Ford will not push back its 2021 target launch for Level 4 vehicles: 90%
  5. 100 Udacity Self-Driving Car Engineer Nanodegree Program students will start new, permanent, full-time autonomous vehicle jobs: 90%
  6. No US highway will have a speed limit for autonomous vehicles that is faster than the speed limit for human-driven vehicles: 90%
  7. 1000 Udacity Self-Driving Car Engineer Nanodegree Program students will start new autonomous vehicle jobs (possibly contract roles or internships): 80%
  8. Level 4 self-driving cars will be available for ridesharing on public roads in Singapore: 80%
  9. A company will be acquired primarily for its autonomous vehicle capabilities with a valuation above $100 MM USD: 80%
  10. No company will sell a vehicle with autonomous technology that exceeds what Tesla offers: 80%
  11. At least 1 company that has not done so before will complete a continuous US coast-to-coast demonstration trip in a fully autonomous vehicle: 80%
  12. Level 4 self-driving cars will be available for ridesharing on public roads in California: 70%
  13. Conditional on Level 3 self-driving cars being legal somewhere, Tesla will enable Level 3 autonomy in that location: 70%
  14. Google will offer rides to the public in its self-driving cars: 70%
  15. No member of the general public will die in a Level 4 vehicle: 70%
  16. 2500 Udacity Self-Driving Car Engineer Nanodegree Program students will start new autonomous vehicle jobs (possibly contract roles or internships): 60%
  17. Level 4 self-driving cars will be available for ridesharing on public roads somewhere in Europe: 60%
  18. Level 4 self-driving cars will not be available for ridesharing on public roads somewhere in China: 60%
  19. No company will be acquired primarily for its autonomous vehicle capabilities with a valuation above $500 MM USD: 60%
  20. A new autonomous vehicle startup will form and raise money at a valuation above $75 MM USD: 60%
  21. Ford will commit to launching Level 4 vehicles before 2021: 50%
  22. Level 4 self-driving cars will be available for ridesharing on public roads somewhere in the world, without a safety driver: 50%
  23. Level 3 self-driving cars will be available for private ownership somewhere in the United States: 50%
  24. Somebody will die in a Tesla Autopilot crash: 50%
  25. Level 4 autonomous vehicles will be available for public ridesharing when snow is on the ground: 50%
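The point of attaching confidence levels is that the predictions can be scored at year’s end, not just counted. One standard way to do that is the Brier score, the mean squared error between stated confidence and the 0/1 outcome. A small sketch, with made-up outcomes purely for illustration:

```python
def brier_score(predictions):
    """Mean squared error between stated probability and the 0/1 outcome.

    Lower is better; always guessing 50% scores 0.25.
    """
    return sum((p - outcome) ** 2 for p, outcome in predictions) / len(predictions)

# Each entry: (stated probability, 1 if the prediction came true, else 0).
# These outcomes are hypothetical, not actual 2017 results.
sample = [(0.99, 1), (0.90, 1), (0.80, 0), (0.70, 1), (0.50, 0)]
print(brier_score(sample))
```

A forecaster is well calibrated if, say, about 70% of their 70%-confidence predictions come true; the Brier score rewards both calibration and decisiveness.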

Note: Udacity has partnerships with several companies I mention here, and I used to work at Ford, but none of these predictions stem from non-public information.

Update: The SAE definitions are here: https://en.wikipedia.org/wiki/Autonomous_car#Classification

Basically, a Level 3 vehicle is fully self-driving, but the driver must be able to take control of the vehicle at a moment’s notice.

A Level 4 vehicle is fully self-driving in most situations, and the driver does not need to be ready to regain control of the vehicle.

Top Posts in 2016

This blog is a quantity-over-quality endeavor. I try to post every day, and hopefully a few things resonate over the course of a year.

Here are the posts that resonated most in 2016:

Term 1: In-Depth on Udacity’s Self-Driving Car Curriculum

We need to get the curriculum off my personal blog and onto our program homepage. But for now, this is where prospective students come to learn what’s in the program.

Tesla’s Autopilot Crash

A meditation on why the very first self-driving car fatality might have happened, and what it meant for the industry.

Self-Driving Car Employers

If you want to work on self-driving cars, here’s who’s hiring, and why.

Udacity Self-Driving Car Nanodegree

I’m building a self-driving car curriculum with Sebastian Thrun! And many other great people and companies.

How to Land an Autonomous Vehicle Job: Coursework

Here are the online courses I took that helped me land a job with Ford’s Autonomous Vehicle team. You, of course, should just enroll in the Udacity Self-Driving Car Engineer Nanodegree Program 🙂

2016 in Review

Looking through my Medium posts from 2016, I wish I had sat down on January 1, 2016, and written predictions of what I thought the year would bring for self-driving cars, so that I could check those predictions now.

I’ll write that kind of post for 2017 this afternoon.

For now, here’s my take on 2016.

January: GM Invests in Lyft

February: Google Causes the First Self-Driving Car Crash

March: GM Buys Cruise

April: Self-Driving Car Companies Lobby for Consistent Federal Regulation

May: Uber Launches Self-Driving Cars in Pittsburgh

June: BMW, Intel, and Mobileye Partner on Self-Driving Cars

July: Delphi Tests Self-Driving Cars in Singapore

August: nuTonomy Launches Self-Driving Cars in Singapore

September: Apple Pulls Back on Self-Driving Cars

October: Tesla Announces Self-Driving Car Plan

November: nuTonomy to Launch Self-Driving Cars in Boston

December: Uber Launches, Then Cancels Self-Driving Cars in San Francisco

I pulled these headlines from Google News. Next year, I hope to see lots more launches. I’d also love to see Udacity make this list in 2017!

Self-Driving Cars and Organ Donation

Slate says: Self-Driving Cars Will Make Organ Shortages Even Worse. This will happen in two ways.

One, about 20% of US organ donations come from car accident victims. Presumably self-driving cars will reduce the number of organs available.

Two, a common place to opt-in to organ donation is at the DMV, while obtaining or renewing a driver’s license. Presumably self-driving cars will reduce the number of people who get driver’s licenses, and thus reduce the number of people who opt-in to organ donation.

The rest of the article is mostly about ways to improve the organ donation system in the US, irrespective of self-driving cars.

But it is an interesting case study of a second-order effect autonomous vehicles will have on our world.