Navimotive Conference

Next Saturday, September 15, I am excited to speak at Navimotive, organized by Intellias, in Kiev, Ukraine!

Self-driving cars really are becoming a worldwide phenomenon, with contributions from every part of the globe. Intellias is a Ukrainian automotive supplier that works with companies like Volkswagen, HERE, and Yandex. There will also be speakers from GlobalLogic, CloudMade, MAPS.ME, and more.

I’ll talk about Carla, Udacity’s self-driving car, and share some information about upcoming Nanodegree programs we’re building.

You should come, too!

If you are a Udacity student in the area, please send me an email (david.silver@udacity.com) and I’d love to arrange a meetup.

Experience Self-Driving Trucks in India with Flux Auto

See the prototype in action—10km of autonomous highway driving!

One of the great joys of teaching self-driving cars at Udacity is watching the amazing things our students build. I’m based in Silicon Valley, but our students come from all over the world, and I have the opportunity on a daily basis to experience the truly global nature of this transportation revolution.

Case in point: Flux Auto.

One of our Udacity self-driving car graduates, Shilpaj Bhalerao, is the tech lead for a small team building self-driving trucks in India. This is no small feat. India is an even greater challenge for self-driving cars than most parts of the world, thanks to complex traffic norms and variable infrastructure quality. So it’s super-exciting to watch their 10km autonomous trucking run in action.

Check it out.

No matter where you are in the world, if your dream is to contribute to the future of autonomous transportation, then you can acquire the skills you need with Udacity. Check out our School of Autonomous Systems today.

Ford’s Self-Driving Vehicle Approach

A few weeks ago, Ford Motor Company published “A Matter of Trust”, a report outlining their autonomous vehicle development process. The document seems pretty clearly intended for government regulators, and indeed Ford sent the report to Transportation Secretary Elaine Chao.

Compared to the Waymo Safety Report released last year, the Ford document is a little more dense, but broadly similar. Both documents discuss how autonomous vehicles work, what sensors are involved, and what processes are in place to ensure safety.

Ford’s document includes more detail about training and procedures for safety operators, as you might expect in the wake of the Uber ATG collision in Arizona.

Ford also shares a bit more information about security. The report mentions efforts related to authentication, network segmentation, and physical partitioning.

All in all, it’s a good read to see how a larger autonomous vehicle organization operates.

Throughput and Latency and Self-Driving Cars

For the past few months I have been writing about self-driving cars for Forbes.com. My latest post is on the potential for self-driving cars to take throughput (road capacity) and latency (speed) well beyond the plateau at which human drivers stop improving.

“Driving around Naples at 25mph for a year is a triumph, but imagine a self-driving car that can navigate Naples, safely, at 40mph? Or faster? Equipped with an array of sensors and super-computers, self-driving vehicles could react quicker and achieve safer speeds well beyond the ability of human drivers.

Similarly, US highway capacity tops out at about 2,000 vehicles per hour, assuming human drivers. Self-driving (and potentially connected) cars, however, may be able to dramatically decrease the distance between vehicles, and even move in unison with other cars. Think of a pack of bumper-to-bumper race cars traveling at 100mph. The improvement in road capacity would be tremendous.”
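The back-of-the-envelope math behind that claim is simple: lane capacity is speed divided by the spacing each vehicle occupies, and spacing is dominated by the time headway a driver (or computer) leaves to the car ahead. Here is a rough sketch; the headway and vehicle-length figures are my own assumptions, not numbers from the Forbes post:

```python
def capacity_vph(speed_mph, headway_s, vehicle_length_ft=15.0):
    """Vehicles per hour per lane: flow = speed / spacing, where
    spacing = vehicle length + the gap left at a given time headway."""
    speed_fps = speed_mph * 5280 / 3600  # mph -> feet per second
    spacing_ft = vehicle_length_ft + speed_fps * headway_s
    return 3600 * speed_fps / spacing_ft

# Human drivers: ~1.6 s headway at 65 mph -> roughly 2,000 vehicles/hour
human = capacity_vph(65, 1.6)

# A tightly coordinated platoon: ~0.3 s headway at 100 mph -> roughly
# 9,000 vehicles/hour, the kind of capacity gain described above
automated = capacity_vph(100, 0.3)
```

At a 1.6-second headway the model lands almost exactly on the 2,000 vehicles-per-hour figure, which is why capacity plateaus with human drivers: reaction time, not road width, is the binding constraint.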

Read the whole thing, and also check out the Benedict Evans post on “Tesla, software and disruption” that sparked this line of thought.

Self-Driving Lyfts in Las Vegas

A few weeks ago Lyft and Aptiv announced their 5000th paid self-driving car ride in Las Vegas. The Lyft blog announcement quotes Raj Kapoor: “Lyft is the largest network currently deploying a commercial self-driving program to the public.”

I believe that is correct and may, in fact, slightly understate Lyft’s position. Lyft is one of the only companies offering self-driving rides to the general public, since Uber halted its autonomous vehicle testing program earlier this year.

Waymo and several other companies have self-driving car pilots available to pre-screened participants, but few companies are currently opening their network to any member of the general public.

Earlier this year, AAA was running a self-driving NAVYA shuttle through downtown Las Vegas, although the website currently implies the shuttle is offline and will return later this month.

Las Vegas has long been a mecca for self-driving cars for a few reasons:

  • The weather is sunny
  • The streets are wide and rectilinear
  • The Consumer Electronics Show in January has prompted companies to run demos in the city already

Waymo announced earlier this year that its Phoenix-area self-driving cars would open to the general public at some point in 2018.

But, for now, the best place for you and me and almost anybody to try out a self-driving car is Las Vegas.

Join Me On Thursday For Our Online Open House For Udacity’s School Of Autonomous Systems!

This Thursday, August 23rd, at 9am Pacific Time, I will be hosting an online open house for Udacity’s School of Autonomous Systems. RSVP now to join me, my Udacity colleagues, our alumni, and other potential students, and learn about our many exciting programs!

The School of Autonomous Systems is the home for Udacity’s Self-Driving Car, Flying Car, Robotics, and Intro to Self-Driving Cars Nanodegree Programs.

At the Open House, I’ll share an overview of each program, compare them, and describe the careers for which each option prepares you.

We’ll finish with a live question-and-answer session, co-hosted by myself and several of my Udacity instructional colleagues, to answer as many of your questions as we can.

RSVP now to join us on Thursday! And don’t worry if you can’t make it! RSVP via the link, and we’ll send you the recording of the open house so you don’t miss out!

Self-Driving Fundamentals: Lesson 1

Discover why our free “Self-Driving Fundamentals: Featuring Apollo” course is such an ideal starting point for anyone new to the field of autonomous systems.

Self-Driving Fundamentals: Featuring Apollo is a terrific, free introduction to how self-driving cars work, through the lens of the Apollo open-source self-driving car project.

It’s perfect for beginners who are in the exploratory phase of their autonomous systems journey, and it does a great job demonstrating how existing basic programming and data skills can be applied to this field.

Upon successfully completing the program, you’ll be ready to combine your newly acquired self-driving car fundamentals with your existing programming skills to enroll in our Intro to Self-Driving Cars Nanodegree program.

The course begins with an overview of self-driving cars, and how they work. From there, we dive into Apollo and its multi-layer architecture: reference vehicle, reference hardware, open-source software, and cloud services.

Self-Driving Fundamentals: Featuring Apollo is led by the leaders of the Apollo project, Udacity founder Sebastian Thrun, and me!

This is a great opportunity to learn the key parts of self-driving cars, and get to know the Apollo architecture. You’ll utilize Apollo HD maps, localization, perception, prediction, planning and control, and start learning the fundamentals of how to build self-driving cars.

And it’s free! You should learn about self-driving cars with us 🚗

Intro to Self-Driving Cars Webinar

Get the inside scoop on how self-driving cars work, and learn about career opportunities in the autonomous vehicle industry!

A few days ago I hosted a webinar with my colleague Jane Sheppard about how self-driving cars work, and about career opportunities in the autonomous vehicle industry. If you’re interested in self-driving cars, you should watch 😉

I talk about business models in the self-driving car space, run down who’s focused on hardware vs. software, and describe all the players in the arena, from the smallest startups to the biggest global organizations (you wouldn’t believe how much money some of the big companies are putting into this technology!). I specifically discuss Tesla’s consumer sales model, and the impact and involvement of ride-sharing companies—and that’s all before I even get to the core technologies that make self-driving cars possible 🚗

If you want to get the low-down on perception, localization, and planning, just hit play below!

“If sensor fusion and computer vision tell you what the world looks like, localization tells you where you are in that world.”

And if you would like to learn about self-driving cars, I recommend the Udacity Intro to Self-Driving Cars Nanodegree Program (for new programmers) or the Self-Driving Car Engineer Nanodegree Program (for intermediate programmers).

See you in the classroom!

Becoming the Industry Standard

When we started the Udacity Self-Driving Car Engineer Nanodegree Program, our goal was to become the industry standard for training self-driving car engineers.

We thought that if we could give people the best education available anywhere, those students would form the next generation of autonomous vehicle engineers. They would go out into the industry, and then look back to Udacity themselves as a source of talent to hire.

At that point, the industry would be open to anybody, anywhere, with the desire and passion to learn how self-driving cars work.

Our partners at Mazda just sent over a job description that includes the “Self-Driving Car Nanodegree” as a qualification, and it makes me feel like we’re getting closer to that goal.

The original job description is in Japanese. Here is the English translation.

If you have joined the Nanodegree Program, then you should apply to join the Mazda Co-Pilot team!

Perception Projects from the Self-Driving Car Nanodegree Program

In these projects, students showcase initiative, creativity, and work ethic, as they build projects focused on topics like perception, deep learning, and computer vision.

Students always tell us that perception, deep learning, and computer vision are some of their favorite topics in the Udacity Self-Driving Car Engineer Nanodegree Program.

Their intense interest in these topics translates directly to the high quality of their work. Today, I’d like to share three especially impressive student projects with you that cover these areas!

Implementing YOLO Using ResNet as a Feature Extractor

Mohammad Atif Khan

I love this project! Mohammad did it on his own, and he went way beyond the requirements of the Nanodegree program to do so. That’s going to serve him well in the long run, because employers love it when talented students get so deep into particular subjects that they start building their own projects to further flesh out their ideas and test their skills.

“In this project I have used a pre-trained ResNet50 network, removed its classifier layers so it becomes a feature extractor, and then added the YOLO classifier layer instead (randomly initialized). I then trained the network on Udacity’s CrowdAI dataset to detect cars in video frames.”
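The recipe Mohammad describes, freezing a pre-trained backbone and bolting a randomly initialized detection head on top, looks roughly like this in Keras. This is an illustrative sketch, not his actual code; the input size, grid, box count, and layer widths are my assumptions:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import ResNet50

INPUT_SIZE, BOXES, CLASSES = 416, 2, 1  # detect a single class: cars

# Pre-trained backbone with its classifier layers removed, so it acts
# purely as a feature extractor (weights="imagenet" in practice; None
# here just keeps the sketch self-contained).
backbone = ResNet50(include_top=False, weights=None,
                    input_shape=(INPUT_SIZE, INPUT_SIZE, 3))
backbone.trainable = False  # freeze the pre-trained features at first

# Randomly initialized YOLO-style head: per grid cell, each box needs
# 5 numbers (x, y, w, h, confidence) plus the class scores.
x = layers.Conv2D(512, 3, padding="same", activation="relu")(backbone.output)
head = layers.Conv2D(BOXES * 5 + CLASSES, 1)(x)

model = Model(backbone.input, head)
```

ResNet50 downsamples the 416×416 input by a factor of 32, so the head predicts boxes over a 13×13 grid; training on a labeled dataset like Udacity’s CrowdAI set then only has to fit the new head before fine-tuning the backbone.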

Semantic Segmentation

Enrique Garcia

Enrique used VGG-16 to create a semantic segmentation network for the Advanced Deep Learning project in the Nanodegree program. He trained that network using the KITTI dataset, and then applied the network to scenes he recorded driving in Mexico. Check out his YouTube videos!

“The original FCN-8s was trained in stages. The authors later uploaded a version that was trained all at once to their GitHub repo. The version in the GitHub repo has one important difference: The outputs of pooling layers 3 and 4 are scaled before they are fed into the 1×1 convolutions. As a result, some students have found that the model learns much better with the scaling layers included. The model may not converge substantially faster, but may reach a higher IoU and accuracy.”
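The scaling trick that quote mentions is tiny in code: multiply each pooling output by a small constant before its 1×1 “scoring” convolution, then fuse the skips with the upsampled deeper features. Here is a sketch; the 1e-4 and 1e-2 factors are the ones used in the FCN authors’ repo, while the tensor shapes and class count are my assumptions:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

NUM_CLASSES = 2  # e.g. road vs. not-road for the KITTI task

def scaled_score(pool_out, scale):
    """Scale a pooling output, then score it with a 1x1 convolution,
    as in the 'all-at-once' FCN-8s described in the quote."""
    x = layers.Rescaling(scale)(pool_out)
    return layers.Conv2D(NUM_CLASSES, 1)(x)

# Stand-ins for the VGG-16 pool3 / pool4 feature maps (shapes assumed)
pool3 = layers.Input(shape=(36, 120, 256))
pool4 = layers.Input(shape=(18, 60, 512))

score3 = scaled_score(pool3, 1e-4)  # pool3 gets the smaller scale
score4 = scaled_score(pool4, 1e-2)

# Fuse the skip connections: upsample the deeper score to match pool3
fused = layers.Add()([score3, layers.UpSampling2D(2)(score4)])
model = Model([pool3, pool4], fused)
```

One common explanation for why this helps: the raw pooling activations are large, and without the scaling they can swamp the deeper features during early training, which is the learning problem the quoted paragraph alludes to.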

Machine Learning for Vehicle Detection

Moataz Elmasry

Moataz built a vehicle detection pipeline combining histogram of oriented gradients, support vector machines, and sliding window search. I particularly like the heatmap he employs to reduce false positives in vehicle detection. This is a great example of going beyond the steps outlined in the Nanodegree program, to build a truly standout project.

“Now given the simplicity of the SVM model, we expect some detections to be false positives. In order to filter out these incorrect detections, one approach is to threshold our positive windows such that we only pick areas where more than one window overlap. In essence we are generating a heatmap of the positive windows.”
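The heatmap filter Moataz describes is a few lines of NumPy: each positive window votes for the pixels it covers, and any pixel with too few votes is discarded. A minimal sketch, with window coordinates and threshold made up for illustration:

```python
import numpy as np

def heatmap_filter(frame_shape, windows, threshold=1):
    """Accumulate positive detection windows into a heatmap, then zero
    out pixels covered by `threshold` or fewer windows."""
    heat = np.zeros(frame_shape, dtype=np.int32)
    for (x1, y1), (x2, y2) in windows:
        heat[y1:y2, x1:x2] += 1  # each positive window votes for its pixels
    heat[heat <= threshold] = 0  # lone detections are likely false positives
    return heat

# Two overlapping windows on a real car, plus one stray false positive
windows = [((10, 10), (40, 40)), ((20, 20), (50, 50)), ((80, 80), (95, 95))]
heat = heatmap_filter((100, 100), windows, threshold=1)
# Only the overlap of the first two windows survives the threshold
```

From the surviving heatmap, connected regions (for example via `scipy.ndimage.label`) can then be grouped into one final bounding box per vehicle.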


Udacity practices project-based learning, which means all of our students in all of our Nanodegree programs build projects like these. This approach enables you to learn practical skills, and to build a dynamic portfolio populated with completed projects that clearly showcase your new skills and experience.

If you’re interested in building amazing projects around perception, deep learning, or any other project related to self-driving cars, you should join our Intro to Self-Driving Cars Nanodegree Program, or our Self-Driving Car Engineer Nanodegree Program!

See you in the classroom, and I can’t wait to see what you build!