Intro to Self-Driving Cars Webinar

Get the inside scoop on how self-driving cars work, and learn about career opportunities in the autonomous vehicle industry!

A few days ago I hosted a webinar with my colleague Jane Sheppard about how self-driving cars work, and about career opportunities in the autonomous vehicle industry. If you’re interested in self-driving cars, you should watch 😉

I talk about business models in the self-driving car space, run down who’s focused on hardware vs. software, and describe all the players in the arena, from the smallest startups to the biggest global organizations (you wouldn’t believe how much money some of the big companies are putting into this technology!). I specifically discuss Tesla’s consumer sales model, and the impact and involvement of ride-sharing companies—and that’s all before I even get to the core technologies that make self-driving cars possible 🚗

If you want to get the low-down on perception, localization, and planning, just hit play below!

“If sensor fusion and computer vision tell you what the world looks like, localization tells you where you are in that world.”

And if you would like to learn about self-driving cars, I recommend the Udacity Intro to Self-Driving Cars Nanodegree Program (for new programmers) or the Self-Driving Car Engineer Nanodegree Program (for intermediate programmers).

See you in the classroom!

Becoming the Industry Standard

When we started the Udacity Self-Driving Car Engineer Nanodegree Program, our goal was to become the industry standard for training self-driving car engineers.

We thought that if we could give people the best education available anywhere, those students would form the next generation of autonomous vehicle engineers. They would go out into the industry, and look back to Udacity as a source of talent to hire from.

At that point, the industry would be open to anybody, anywhere, with the desire and passion to learn how self-driving cars work.

Our partners at Mazda just sent over a job description that includes the “Self-Driving Car Nanodegree” as a qualification, and it makes me feel like we’re getting closer to that goal.

The original job description is in Japanese. Here is the English translation.

If you have joined the Nanodegree Program, then you should apply to join the Mazda Co-Pilot team!

Perception Projects from the Self-Driving Car Nanodegree Program

In these projects, students showcase initiative, creativity, and work ethic as they tackle topics like perception, deep learning, and computer vision.

Students always tell us that perception, deep learning, and computer vision are some of their favorite topics in the Udacity Self-Driving Car Engineer Nanodegree Program.

Their intense interest in these topics translates directly to the high quality of their work. Today, I’d like to share three especially impressive student projects with you that cover these areas!

Implementing YOLO Using ResNet as a Feature Extractor

Mohammad Atif Khan

I love this project! Mohammad did it on his own, and he went way beyond the requirements of the Nanodegree program to do so. That’s going to serve him well in the long run, because employers love it when talented students get so deep into particular subjects that they start building their own projects to further flesh out their ideas and test their skills.

“In this project I have used a pre-trained ResNet50 network, removed its classifier layers so it becomes a feature extractor, and then added the YOLO classifier layer instead (randomly initialized). I then trained the network on Udacity’s CrowdAI dataset to detect cars in video frames.”
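To make the backbone swap concrete, here is a minimal Keras sketch of the idea Mohammad describes. The head geometry below assumes the original YOLO output layout (a 7×7 grid with 30 channels: 2 boxes plus 20 class scores per cell); his actual head and training setup may well differ.

```python
import tensorflow as tf

# Hypothetical sketch: ResNet50 with its classifier removed becomes a
# feature extractor, and a YOLO-style detection head is added on top.
backbone = tf.keras.applications.ResNet50(
    include_top=False,        # drop the ImageNet classifier layers
    weights=None,             # use weights="imagenet" to start pre-trained
    input_shape=(448, 448, 3),
)
x = backbone.output                                    # (None, 14, 14, 2048)
x = tf.keras.layers.Conv2D(
    1024, 3, strides=2, padding="same", activation="relu"
)(x)                                                   # downsample to a 7x7 grid
preds = tf.keras.layers.Conv2D(30, 1)(x)               # per-cell box + class scores
model = tf.keras.Model(backbone.input, preds)          # output: (None, 7, 7, 30)
```

The new head layers start randomly initialized, as in Mohammad’s description, so only the backbone carries over pre-trained knowledge.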

Semantic Segmentation

Enrique Garcia

Enrique used VGG-16 to create a semantic segmentation network for the Advanced Deep Learning project in the Nanodegree program. He trained that network using the KITTI dataset, and then applied the network to scenes he recorded driving in Mexico. Check out his YouTube videos!

“The original FCN-8s was trained in stages. The authors later uploaded a version that was trained all at once to their GitHub repo. The version in the GitHub repo has one important difference: The outputs of pooling layers 3 and 4 are scaled before they are fed into the 1×1 convolutions. As a result, some students have found that the model learns much better with the scaling layers included. The model may not converge substantially faster, but may reach a higher IoU and accuracy.”
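As a rough illustration of that scaling trick, here is a toy numpy sketch of the FCN-8s skip fusion. The 0.01 and 0.0001 factors match the “all-at-once” model in the FCN authors’ GitHub repo; the random feature maps and nearest-neighbour upsampling are simplified stand-ins for real VGG activations and learned transposed convolutions.

```python
import numpy as np

rng = np.random.default_rng(0)
num_classes = 2  # e.g. road / not-road for KITTI

# Stand-ins for VGG-16 activations on one image (assumed toy shapes).
pool3 = rng.standard_normal((28, 28, 256))
pool4 = rng.standard_normal((14, 14, 512))
coarse = rng.standard_normal((7, 7, num_classes))  # stand-in for conv7 scores

def conv1x1(x, out_ch):
    """A 1x1 convolution is per-pixel channel mixing: (H,W,C) @ (C,out_ch)."""
    w = rng.standard_normal((x.shape[-1], out_ch)) * 0.01
    return x @ w

def upsample2x(x):
    """Nearest-neighbour stand-in for the learned 2x transposed convolution."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

# The all-at-once trick: scale pooling outputs BEFORE the 1x1 convolutions.
pool4_scores = conv1x1(pool4 * 0.01, num_classes)
pool3_scores = conv1x1(pool3 * 0.0001, num_classes)

fused = upsample2x(coarse) + pool4_scores   # 14x14 fusion
fused = upsample2x(fused) + pool3_scores    # 28x28 fusion
```

Without the scaling, the pooling-layer scores can swamp the coarse stream early in training, which is why students see better learning with it included.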

Machine Learning for Vehicle Detection

Moataz Elmasry

Moataz built a vehicle detection pipeline combining histogram of oriented gradients, support vector machines, and a sliding window search. I particularly like the heatmap he employs to reduce false positives. This is a great example of going beyond the steps outlined in the Nanodegree program to build a truly standout project.

“Now given the simplicity of the SVM model, we expect some detections to be false positives. In order to filter out these incorrect detections, one approach is to threshold our positive windows such that we only pick areas where more than one window overlap. In essence we are generating a heatmap of the positive windows.”
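That voting scheme is simple to sketch in numpy. The windows below are made up for illustration (two overlapping on a real car, one lone false positive), but the thresholding logic is the same:

```python
import numpy as np

heat = np.zeros((96, 128), dtype=int)  # toy image-sized heatmap

# Hypothetical positive windows from the SVM, as (x1, y1, x2, y2) boxes.
windows = [(10, 20, 42, 52),    # overlapping pair on a real car
           (14, 24, 46, 56),
           (80, 60, 112, 92)]   # isolated false positive
for x1, y1, x2, y2 in windows:
    heat[y1:y2, x1:x2] += 1     # each detection "votes" for its pixels

# Keep only pixels covered by more than one window; the lone
# false-positive window never clears the threshold.
mask = heat > 1
```

In a full pipeline you would then group the surviving pixels into connected regions and draw one bounding box per region.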


Udacity practices project-based learning, which means all of our students in all of our Nanodegree programs build projects like these. This approach enables you to learn practical skills, and to build a dynamic portfolio populated with completed projects that clearly showcase your new skills and experience.

If you’re interested in building amazing projects around perception, deep learning, or any other project related to self-driving cars, you should join our Intro to Self-Driving Cars Nanodegree Program, or our Self-Driving Car Engineer Nanodegree Program!

See you in the classroom, and I can’t wait to see what you build!

Udacity Grads in Self-Driving Car Jobs and Beyond

Since we launched our Self-Driving Car Engineer Nanodegree program in late 2016, nearly 2,000 students have completed the program, and more new graduates are joining them every month.

Not all students enroll in the program specifically to find a new job, but many do, and it’s exciting to see a new generation of talent entering this field. With that in mind, we’d like to introduce you to some of these alums!

One of our early graduates, Robert Ioffe, transitioned within his company, Intel, to the role of Senior Self-Driving Car Software Engineer shortly after enrolling in our Nanodegree Program. Since then, Intel has announced its plan to begin testing 100 self-driving cars in Jerusalem, and eventually in the U.S.

“The coolest thing is that everything I learn in the class is immediately applicable to my current job, which is building a self-driving Range Rover with Intel technology inside. It is very rare where you can learn things in class one day, and the next day you can apply it in your work!”

Megha Maheshwari, who immigrated to the U.S. from India, began her career as a software engineer, but ultimately landed a position at Volvo Cars as an Autonomous Driving, Deep Learning, and Computer Vision Engineer. Volvo is investing heavily in fully electric and self-driving vehicles, which many industry analysts believe is part of its plan to go public.

“When I was ending Term One, I felt I had enough knowledge about classical computer vision and deep learning. That’s when I started looking for jobs and not long after looking and applying, I got hired.”

Udacity grads are also launching startups to capitalize on the growing opportunities in the autonomous vehicle market. After earning their Nanodegree credentials, alums David Hayes and Duncan Iglesias formed the Autonomous Vehicle Organization, or AVO, to increase safety and security by focusing on the vehicle-to-pedestrian (V2P) segment.

Using skills developed in the course, David and Duncan built a semi-autonomous Honda Civic in just 11 days!

Fellow alum Patrick Kern also co-founded a startup, with a different focus. Brighter AI was launched to develop “Deep Natural Anonymization” to help companies comply with new General Data Protection Regulations (GDPR) in Europe.

And just last month, we profiled Han Bin Lee, who teamed up with fellow students he met while working on the Udacity-Didi challenge to start Seoul Robotics. Expanding on the lidar perception software they began building during the challenge, the company is already looking for a 3D Vision Researcher and a C++ Software Developer to join their crew in South Korea!

“We’re looking for people with the willingness and ability to learn new concepts and algorithms from the latest research. And we’re a startup, so we need team players, who are able to work effectively within a fast-growing, diverse group of people — we have Korean, Vietnamese, American, and Irish people working with us so far. We are really short of people right now, and we would love to work with fellow Udacity alumni to build this company together!”

Because this is still an emerging field, many of our early students didn’t start out wanting to become Self-Driving Car Engineers. That’s changing rapidly as people are realizing how much opportunity there is in this space, but for lifelong learners like Kyle Martin, the decision to enter this field was the culmination of a really interesting journey:

“I started looking for an industry role while I was still in the program. A lead robotics engineer role appeared with a company that was beginning to work on an autonomous shuttle. I jumped at it and got an interview! They were interested in all the areas I’d been working on — things like computer vision and systems architecture. And they were really impressed I’d kept learning and adding to my skills in the program. When they made me an offer, I said ‘yes’ immediately — it sounded like I’d have the opportunity to work on really groundbreaking projects.”

These are just a few examples of how alumni of our autonomous transportation programs are having an impact on this incredible field. As our alumni network continues to grow, we’re excited to help more students find positions in the industry. Stay tuned!

The Story of the Model 3

Bloomberg published a terrific long-form piece last week entitled, “Hell for Elon Musk Is a Midsize Sedan”.

The piece covers everything from Musk’s personal work style, to Tesla’s strategy of vertical integration, to the triumphs and failures on the way to finally hitting the goal of 5,000 cars per week at the end of June. The article also questions how sustainable that success really is.

“In early June, at Tesla’s annual meeting, Musk tried to project calm, but at times seemed close to tears. “This is like — I tell you — the most excruciatingly hellish several months that I have ever had,” he said, before noting that Tesla’s assembly lines were being further upgraded, making the company “very likely” to hit the weekly goal of 5,000. He also revealed he’d asked employees to build a third general assembly line that would be “dramatically better than Lines 1 and 2.” That sounded even more alien-dreadnoughty.”

I’ve had some difficulty reconciling the massive success of the Model 3 as a product with the tremendous manufacturing struggles Tesla has experienced getting the car out the door. This piece helped me put the two together.

Criminal Complaint Reveals Apple Self-Driving Information

This reads like the climax of a James Bond movie. A recently-fired Apple engineer stole secrets from the company, possibly in collusion with a Chinese self-driving car competitor, and was arrested by the FBI at San Jose Airport, just before boarding a plane to China. The Mercury News got the scoop, and kudos to them.

What’s more, the associated “Criminal Complaint” reveals a few previously unknown details of Apple’s secretive self-driving car effort.

https://www.courtlistener.com/recap/gov.uscourts.cand.328942/gov.uscourts.cand.328942.1.0.pdf

The main reveal is the size of the program: 5,000 employees are “disclosed” on the project (out of 135,000 total Apple employees), and 2,700 have been granted access to project databases. That is a big self-driving car team.

The complaint does not disclose the exact nature of the stolen secrets, but they seem to involve proprietary circuit boards for sensors.

This is basically the Waymo-Uber lawsuit all over again, plus a dash of international intrigue.

A few years ago, Apple was a highly visible player in the self-driving car world, in spite of their attempted stealth and unwillingness to even acknowledge working on self-driving cars. Recently, Apple has become much less visible, basically because they aren’t demonstrating their cars and attention has moved on to companies that have products to show.

Think about the size of this self-driving car program, though. Then consider that Apple has more self-driving cars in California than any other company. And they have about $300 billion in cash.

Maybe we should be paying more attention to Apple.

I-Pace

The I-Pace, the high-performance electric SUV collaboration between Waymo and Jaguar, is live in San Francisco. So says Clean Technica.

Some news I didn’t notice a few months ago: the partnership between Waymo and Jaguar Land Rover will supposedly be deeper than the Waymo-FCA partnership that has led to the self-driving Pacifica minivan, according to The Verge.

“Waymo and Jaguar Land Rover’s engineers will work in tandem to build these cars to be self-driving from the start, rather than retrofitting them after they come off the assembly line.”

Self-Driving Car Fundamentals: Featuring Apollo

Udacity just launched a free course about self-driving cars with Baidu, the creators of the Apollo open-source self-driving car framework!

“Self-Driving Fundamentals: Featuring Apollo” is a conceptual overview of the key components of a self-driving car and how they work. No math or coding required!

Baidu is one of China’s most important Internet companies, and runs the largest search engine in the largest country in the world. They have also built Apollo, an open-source self-driving car framework adopted by more and more companies around the globe.

This course is both free and in English (with Chinese subtitles, of course). It consists of 7 lessons:

  • Welcome
  • HD Maps
  • Localization
  • Perception
  • Prediction
  • Planning
  • Control

These are the fundamental components that comprise the autonomous vehicle software stack, and you can learn how they work by following along!

Self-driving cars are truly a global phenomenon, with centers of innovation in North America, Europe, and of course Asia. This course was built with Udacity’s US and China teams, and Baidu’s US and China teams. It is really exciting to watch engineers from around the world work together on some of the most amazing technology mankind has ever produced.

Robots Don’t Hurt Robots, People Hurt Robots

Wired has an amusing article on the difficulty of building self-driving cars, as evidenced by the fact that humans keep crashing into them. The story is pegged to a Cruise-on-Cruise collision, in which a Cruise safety driver, driving manually, accidentally rear-ended a Cruise vehicle operating in autonomous mode.

“On June 11, a self-driving Cruise Chevrolet Bolt had just made a left onto San Francisco’s Bryant Street, right near the General Motors-owned company’s garage. Then, whoops: Another self-driving Cruise, this one being driven by a Cruise human employee, thumped into its rear bumper. Yes, very minor Cruise on Cruise violence.”

This prompted me to go digging through the California DMV’s Report of Traffic Collision Involving an Autonomous Vehicle (OL 316). There have been 79 such reports so far. Here are some of the latest:

Fore!

Beware Squirrel

Drifting

Forward!

Fortunately, none of these collisions was especially serious, unlike a few other incidents that have been in the news. But they do serve to highlight just how often human drivers cause collisions. Watch out!