Self-Driving Asteroids

NASA is providing an exploratory grant to a company called Made in Space for the purpose of turning asteroids into autonomous vehicles.

“The objective of this study is for Made In Space (MIS) to establish the concept feasibility of using the age-old technique of analog computers and mechanisms to convert entire asteroids into enormous autonomous mechanical spacecraft. Project RAMA, Reconstituting Asteroids into Mechanical Automata, has been designed to leverage the advancing trends of additive manufacturing (AM) and in-situ resource utilization (ISRU) to enable asteroid rendezvous missions in which a set of technically simple robotic processes convert asteroid elements into very basic versions of spacecraft subsystems (GNC, Propulsion, Avionics).”

And you thought flying cars were the next big thing.

More information here.

Robots Don’t Hurt Robots, People Hurt Robots

Wired has an amusing article on the difficulty of building self-driving cars, as evidenced by the fact that humans keep crashing into them. The story is pegged to a Cruise-on-Cruise collision, in which a Cruise safety driver, driving manually, accidentally rear-ended a Cruise vehicle operating in autonomous mode.

“On June 11, a self-driving Cruise Chevrolet Bolt had just made a left onto San Francisco’s Bryant Street, right near the General Motors-owned company’s garage. Then, whoops: Another self-driving Cruise, this one being driven by a Cruise human employee, thumped into its rear bumper. Yes, very minor Cruise on Cruise violence.”

This prompted me to go digging through the California DMV’s Report of Traffic Collision Involving an Autonomous Vehicle (OL 316). There have been 79 such reports so far. Here are some of the latest:

Fore!

Beware Squirrel

Drifting

Forward!

Fortunately, none of these collisions was especially serious, unlike a few other incidents that have been in the news. But they do serve to highlight just how often human drivers cause collisions. Watch out!

Baidu Apolong Buses

Baidu is launching self-driving shuttle buses running their Apollo open-source self-driving software. The bus model is called “Apolong”, which I assume is a portmanteau of “Apollo” and “King Long”, the Chinese vehicle manufacturer.

At the Baidu Create 2018 conference, Baidu CEO Robin Li announced the autonomous shuttles will launch this year in China and early next year in Japan.

“2018 marks the first year of commercialization for autonomous driving. From the volume production of Apolong, we can truly see that autonomous driving is making great strides — taking the industry from zero to one.”

The autonomous vehicle market might come to resemble the mobile phone market, but with the roles reversed: this time Google (technically, Alphabet's Waymo) controls the closed ecosystem, while Baidu's open-source Apollo plays the Android role.

Baidu says it will support at least four companies’ computational units: Intel, NVIDIA, NXP, and Renesas. That’s part of the larger group of 100+ partners they’ve signed up.

The New Udacity Self-Driving Car Engineer Nanodegree Program Syllabus

A focus on fundamental skills in each core area of the self-driving car stack.

Over 12,000 students have enrolled in Udacity’s Self-Driving Car Engineer Nanodegree Program, and many of them are now working in the autonomous vehicle industry.

These successes have taught us a great deal about what you need to know in order to accomplish your goals, and to advance your career. In particular, we’ve learned that by narrowing the breadth of the program, and expanding opportunities to go deep in specific areas, we can better offer a path that is expressly tailored to support your career journey.

To that end, we’re updating the curriculum for the program to focus on fundamental skills in each core area of the self-driving car stack. I’d like to share some details with you about this important update, and about the changes we’ve made.

Term 1

Introduction

  1. Welcome
    In our introduction, you’ll begin by meeting your instructors — Sebastian Thrun, Ryan Keenan, and myself. You’ll learn about the systems that comprise a self-driving car, and the structure of the program as a whole.
  2. Workspaces
    Udacity’s new in-browser programming editor moves you straight to programming, and past any challenges related to installing and configuring dependencies.

Computer Vision

  1. Computer Vision Fundamentals
    Here, you’ll use OpenCV image analysis techniques to identify lines, including Hough transforms and Canny edge detection.
  2. Project: Detect Lane Lines
    This is really exciting—you’ll detect highway lane lines from a video stream in your very first week in the program!
  3. Advanced Computer Vision
    This is where you’ll explore the physics of cameras, and learn how to calibrate, undistort, and transform images. You’ll study advanced techniques for lane detection with curved roads, adverse weather, and varied lighting.
  4. Project: Advanced Lane Detection

In this project, you'll detect lane lines in a variety of conditions, including changing road surfaces, curved roads, and variable lighting. You'll use OpenCV to implement camera calibration and image transforms, as well as apply filters, polynomial fits, and splines.

Deep Learning

  1. Neural Networks
    Here, you’ll survey the basics of neural networks, including regression, classification, perceptrons, and backpropagation.
  2. TensorFlow
    Next up, you’ll train a logistic classifier using TensorFlow. And, you’ll implement related techniques, such as softmax probabilities and regularization.
  3. Deep Neural Networks
    This is where you’ll combine activation functions, backpropagation, and regularization, all using TensorFlow.
  4. Convolutional Neural Networks
    Next, you’ll study the building blocks of convolutional neural networks, which are especially well-suited to extracting data from camera images. In particular, you’ll learn about filters, stride, and pooling.
  5. Project: Traffic Sign Classifier

For this project, you’ll implement and train a convolutional neural network to classify traffic signs. You’ll use validation sets, pooling, and dropout to design a network architecture and improve performance.
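The softmax mentioned in the TensorFlow lesson is simple enough to sketch directly in NumPy; this shows the math only, not the course's TensorFlow code. Softmax turns raw class scores (logits) into a probability distribution:

```python
import numpy as np

def softmax(logits):
    # Subtract the max logit first: exp() can overflow for large scores,
    # and the shift does not change the result.
    z = np.exp(logits - np.max(logits))
    return z / z.sum()

# Three raw class scores -> probabilities that sum to 1.
scores = np.array([2.0, 1.0, 0.1])
probs = softmax(scores)
```

The highest score gets the highest probability, and the outputs always sum to one, which is what lets a classifier interpret them as class confidences.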

  1. Keras
    This will be your opportunity to build a multi-layer convolutional network in Keras. And, you’ll compare the simplicity of Keras to the flexibility of TensorFlow.
  2. Transfer Learning
    Here, you’ll fine-tune pre-trained networks to apply them to your own problems. You’ll study canonical networks such as AlexNet, VGG, GoogLeNet, and ResNet.
  3. Project: Behavioral Cloning

For this project, you’ll architect and train a deep neural network to drive a car in a simulator. You’ll collect your own training data, and use it to clone your own driving behavior on a test track.

Career Development

  1. GitHub
    For this career-focused project, you’ll get support and guidance on how to polish your portfolio of GitHub repositories. Hiring managers and recruiters will often explore your GitHub portfolio before an interview. So it’s important to create a professional appearance, make it easy to navigate, and ensure it showcases the full measure of your skills and experience.

Sensor Fusion

Our terms are broken out into modules, each of which comprises a series of focused lessons. This Sensor Fusion module is built with our partners at Mercedes-Benz. The team at Mercedes-Benz is amazing. They are world-class automotive engineers applying autonomous vehicle techniques to some of the finest vehicles in the world. They are also Udacity hiring partners, which means the curriculum we’ve developed is expressly designed to nurture and advance the kind of talent they’re eager to hire!

  1. Sensors
    The first lesson of the Sensor Fusion Module covers the physics of two of the most important sensors on an autonomous vehicle — radar and lidar.
  2. Kalman Filters
    Kalman filters are a key mathematical tool for fusing together data. You’ll implement these filters in Python to combine measurements from a single sensor over time.
  3. C++ Checkpoint
    This is a chance to test your knowledge of C++ to evaluate your readiness for the upcoming projects.
  4. Geometry and Trigonometry
    Before advancing further, you’ll refresh your knowledge of the fundamental geometric and trigonometric functions necessary to model vehicular motion.
  5. Extended Kalman Filters
    Extended Kalman Filters (EKFs) are used by autonomous vehicle engineers to combine measurements from multiple sensors into a non-linear model. First, you’ll learn the physics and mathematics behind vehicular motion. Then, you’ll combine that knowledge with an extended Kalman filter to estimate the positions of other vehicles on the road.
  6. Project: Extended Kalman Filters in C++

For this project, you’ll use data from multiple sensors to track a vehicle’s motion, and estimate its location with precision. Building an EKF is an impressive skill to show an employer.
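A one-dimensional toy version gives the flavor of the Kalman filter's predict/update cycle. This sketch is mine, not Udacity's; the course projects use multi-dimensional state and, for the EKF, a nonlinear measurement model.

```python
# Minimal 1-D Kalman filter: estimate a (roughly constant) position from
# noisy measurements. The state is a single scalar; q is process noise
# variance, r is measurement noise variance.
def kalman_step(x, p, z, q=0.01, r=1.0):
    # Predict: the position is assumed unchanged; uncertainty grows by q.
    p = p + q
    # Update: the Kalman gain k blends prediction and measurement,
    # trusting whichever currently has the smaller variance.
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1 - k) * p
    return x, p

x, p = 0.0, 1000.0   # start anywhere, with very high uncertainty
for z in [5.1, 4.9, 5.2, 5.0, 4.8]:
    x, p = kalman_step(x, p, z)
# x converges near the true position (about 5.0) and p shrinks
```

Notice the first measurement dominates (the gain is nearly 1 because the prior uncertainty is huge), and later measurements refine the estimate more gently.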

Term 2

Localization

This module is also built with our partners at Mercedes-Benz, who employ cutting-edge localization techniques in their own autonomous vehicles. Together we show students how to implement and use foundational algorithms that every localization engineer needs to know.

  1. Introduction to Localization
    In this intro, you’ll study how motion and probability affect your understanding of where you are in the world.
  2. Markov Localization
    Here, you’ll use a Bayesian filter to localize the vehicle in a simplified environment.
  3. Motion Models
    Next, you’ll learn basic models for vehicle movements, including the bicycle model. You’ll estimate the position of the car over time given different sensor data.
  4. Particle Filter
    Next, you’ll use a probabilistic sampling technique known as a particle filter to localize the vehicle in a complex environment.
  5. Implementation of a Particle Filter
    To prepare for your project, you’ll implement a particle filter in C++.
  6. Project: Kidnapped Vehicle

For your actual project, you’ll implement a particle filter to take real-world data and localize a lost vehicle.
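The probabilistic sampling idea behind the particle filter can be sketched in one dimension. The setup below (a single landmark, a stationary car, the noise levels) is illustrative only, not the 2-D project itself: particles are hypotheses about position, each weighted by how well it explains the measurement, then resampled.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D particle filter: localize a stationary car on a line from noisy
# signed offsets to a landmark at a known position.
LANDMARK, TRUE_POS, SIGMA = 10.0, 4.0, 0.5

particles = rng.uniform(0.0, 20.0, size=1000)   # uniform prior over the road
for _ in range(5):
    z = (LANDMARK - TRUE_POS) + rng.normal(0.0, SIGMA)   # noisy measurement
    # Weight each particle by the likelihood of the measurement given
    # that particle's hypothesized position (Gaussian measurement model).
    w = np.exp(-0.5 * (((LANDMARK - particles) - z) / SIGMA) ** 2)
    w /= w.sum()
    # Resample in proportion to weight, then jitter to keep diversity.
    particles = rng.choice(particles, size=particles.size, p=w)
    particles += rng.normal(0.0, 0.1, size=particles.size)

estimate = particles.mean()   # lands near TRUE_POS = 4.0
```

The same weight-resample loop scales to 2-D with a motion model added between measurement updates, which is essentially what the Kidnapped Vehicle project asks for.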

Planning

  1. Search
    First, you’ll learn to search the environment for paths to navigate the vehicle to its goal.
  2. Prediction
    Then, you’ll estimate where other vehicles on the road will be in the future, utilizing both models and data.
  3. Behavior Planning
    Next, you’ll model your vehicle’s behavior choices using a finite state machine. You’ll construct a cost function to determine which state to move to next.
  4. Trajectory Generation
    Here, you’ll sample the motion space, and optimize a trajectory for the vehicle to execute its behavior.
  5. Project: Highway Driving

For your project, you’ll program a planner to navigate your vehicle through traffic on a highway. Pro tip: Make sure you adhere to the speed, acceleration, and jerk constraints!
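The finite-state-machine-plus-cost-function idea from the behavior planning lessons boils down to "evaluate every legal successor state, pick the cheapest." The state names and cost terms below are hypothetical, just to show the mechanic; they are not the course's actual planner.

```python
# Hypothetical behavior-planner sketch: a finite state machine whose next
# state is chosen by a cost function.
TRANSITIONS = {
    "keep_lane": ["keep_lane", "prepare_lane_change"],
    "prepare_lane_change": ["keep_lane", "prepare_lane_change", "change_lane"],
    "change_lane": ["keep_lane"],
}

def choose_state(current, cost_fn):
    # Evaluate every legal successor state and return the cheapest one.
    return min(TRANSITIONS[current], key=cost_fn)

def cost(state, lane_speed=15.0, target_speed=22.0):
    # Staying in a slow lane costs us the speed we give up; maneuvers
    # carry a small fixed penalty so we don't change lanes gratuitously.
    speed_penalty = (target_speed - lane_speed) if state == "keep_lane" else 0.0
    maneuver_penalty = {"keep_lane": 0.0,
                        "prepare_lane_change": 1.0,
                        "change_lane": 2.0}[state]
    return speed_penalty + maneuver_penalty

# Our lane is slow (15 m/s vs. a 22 m/s target), so the planner starts a
# lane change rather than staying put.
next_state = choose_state("keep_lane", cost)
```

A real planner's cost function sums many weighted terms (collision risk, legality, comfort, efficiency), but the selection mechanic is the same.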

Control

  1. Control
    You’ll begin by building control systems that actuate a vehicle to move it along a path.
  2. Project: PID Control

Then, you’ll implement the classic closed-loop controller — a proportional-integral-derivative control system.
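A PID controller itself fits in a few lines. The sketch below steers against cross-track error (CTE) for a deliberately simplified plant in which lateral position responds directly to the steering command; the gains are arbitrary illustrative values, not tuned numbers from the project.

```python
class PID:
    """Proportional-integral-derivative controller on cross-track error."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def control(self, error, dt=1.0):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        # P fights the current error, I removes steady-state bias,
        # D damps the oscillation that P alone would cause.
        return -(self.kp * error + self.ki * self.integral + self.kd * derivative)

# Toy plant: lateral position shifts by exactly the steering command.
pid = PID(kp=0.5, ki=0.01, kd=0.2)
cte = 1.0
for _ in range(200):
    cte += pid.control(cte)
# cte is driven close to zero
```

Tuning is the real work in the project: too much P oscillates, too little D fails to damp it, and the I term matters most when the car has a systematic bias like a misaligned steering offset.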

Career Development

  1. Build Your Online Presence
    Here, you’ll continue to develop your professional brand, with the goal of making it easy for employers to understand why you are the best candidate for their job.

System Integration

  1. Autonomous Vehicle Architecture
    Get ready! It’s time to learn the system architecture of Carla, Udacity’s own self-driving car!
  2. Introduction to ROS
    Here, you’ll navigate Robot Operating System (ROS) to send and receive messages, and perform basic commands.
  3. Packages & Catkin Workspaces
    Next, you’ll create and prepare a ROS package so that you are ready to deploy code on Carla.
  4. Writing ROS Nodes
    Then, you’ll develop ROS nodes to perform specific vehicle functions, like image classification or motion control.
  5. Project: Program an Autonomous Vehicle

Finally, for your last project, you’ll deploy your team’s code to Carla, a real self-driving car, and see how well it drives around the test track!

  1. Graduation
    Congratulations! You did it!

By structuring our curriculum in this way, we’re able to offer you the opportunity to master critical skills in each core area of the self-driving car stack. You’ll establish the core foundations necessary to launch or advance your career, while simultaneously preparing yourself for more specialized and advanced study.

Ready? Let’s drive!

Virtual Reality and Self-Driving Cars

For years I’ve heard whispers about virtual reality inside of self-driving cars, supposedly incubated as quiet projects inside of secretive companies. Variety just delivered a small scoop in that vein, by digging up a couple of Apple patent applications related to virtual reality inside of self-driving cars.

Variety quotes from one of the patent applications:

“For example, a virtual representation of an author or talk show host may appear to be sitting in the seat next to the passenger; the virtual author may be reading one of their books to the passenger, or the virtual talk show host may be hosting their show from the seat next to the passenger, with their voices provided through the audio system. As another example, the passenger may experience riding on a flatbed truck with a band playing a gig on the flatbed, with the band’s music provide (sic) through the audio system.”

An extra wrinkle in the article is that the name on a couple of these patent applications is Mark Rober. Rober was previously unknown to me, but he has a YouTube channel with 3.5 million subscribers and videos like Lemon Powered Supercar and World’s LARGEST NERF GUN.

Regardless of the patents’ author, a more prosaic description comes in the first sentence of one of the patent summaries:

“A VR system for vehicles that may implement methods that address problems with vehicles in motion that may result in motion sickness for passengers.”

I’ve written in the past about my difficulty doing serious cognitive work in a moving vehicle, even though I’m not driving. There are many ways to address this: improve the quality of roads, improve the quality of motion control through vehicular software, better mechanical suspension systems, possibly flying cars, and of course virtual reality.

Of these options, virtual reality seems the least likely to me, because my impression is that virtual reality tends to induce motion sickness even when the user is stationary. The patent argues that a moving vehicle in some ways mitigates the motion sickness problem, but my intuition says otherwise.

I would be excited to be proven wrong, though, and would love to try it out.

Driverless Grocery Delivery

Grocery delivery seems like one of the obvious applications of self-driving vehicles, but I wouldn’t have guessed that Kroger would be out front on it.

“To start out, Nuro will use a fleet of self-driving test vehicles with human safety drivers to make deliveries for Kroger’s grocery stores. Customers can track and interact with the vehicles via a Nuro app or Kroger’s pre-existing online delivery platform. But if Nuro’s human test drivers don’t get out to help you, don’t be mad because in our driverless future, we all need to pitch in and unload our own groceries.”

I think the tipping point for autonomous vehicles will come when boring, non-tech businesses like grocers can put them to work without having to develop their own fleets. So far, most self-driving car companies have either been in the mobility space to begin with, or have had to dive into the mobility space as part of developing that business line.

If Kroger can utilize self-driving cars as a service, without having to build self-driving cars themselves, that would be a big step forward.

Small World

Our friends and former Udacity colleagues at Voyage just announced the addition of Drew Gray as their CTO. Drew was one of the key contributors to building the Udacity Self-Driving Car Engineer Nanodegree Program, and has taught both our students and me so much. Any company would be lucky to have Drew, so this is a huge hire for Voyage.

In just a few years, the self-driving car industry has seen so much movement, and in many ways it still feels like a small community that is poised for tremendous growth.

At Udacity, we are fortunate to be right in the middle of this. We met Drew right after he’d joined Otto from Cruise, and then worked with him at Uber and now Voyage. Our partners at NVIDIA and Mercedes-Benz have hired many of our graduates, and we see them regularly at conferences and industry events.

Udacity students work at every company I can think of in the autonomous vehicle industry, from manufacturers like BMW and Ford to suppliers like Bosch to ride-sharing companies like Lyft to startups like Parkopedia and Phantom Auto.

It’s a lot of fun to be part of this community and grow with it.

Michigan Central Station

A couple of miles from downtown Detroit, in the Corktown neighborhood, stands Michigan Central Station. At 18 stories, it is a grand building that has been vacant and deteriorating for decades.

Ford Motor Company just announced that they have purchased the building and surrounding land, with plans to turn it into a campus for their autonomous vehicle and mobility teams.

The plans call for all of this work to complete by 2022, so it will be a while coming, but the vision is big and I’m looking forward to seeing it unfold.

Full Self-Driving Tesla Features

Elon Musk broke the Internet a few days ago with a tweet promising that in August Tesla “will begin to enable full self-driving features.”

I am pretty excited about this, but it is also worth noting that this verbiage is vague enough to drive a Tesla Semi through.

“Full self-driving features” presumably means something beyond current Tesla Autopilot, but short of “full self-driving”. What are the features that make up full self-driving?

Some ideas include end-to-end routing, even if drivers still have to pay attention to the road. Or even just automatic lane change decision and execution on a highway.

Potentially this could mean that Tesla is enabling drivers to stop paying attention to the road under certain scenarios, although that seems unlikely, given the recent spate of crashes.

It will be exciting to see what, if anything, comes of this. But Elon Musk himself warns us that these tweets are not well-thought out strategy, but rather off-the-cuff remarks: