Listening to Tim Harford on a back episode of EconTalk, I was struck by their extended discussion of the elevator as one of the original autonomous vehicles.
The discussion seems at once distant from self-driving cars and strangely relevant. Harford and EconTalk host Russ Roberts touch on safety, technological unemployment, and traffic optimization.
Udacity Self-Driving Car Engineer Nanodegree program students are taking their newly-mastered skills into the broader world, and their projects are incredible!
The talent and passion of students in Udacity’s Self-Driving Car Engineer Nanodegree Program regularly astounds me. Here are five independent projects that students did outside of the program to build their skills as autonomous vehicle engineers.
Check out the autonomous hardware package strapped to the top of this tiny red Range Rover! And the various test track configurations it navigates. Super cool.
Spatial Transformers are modules that can be inserted into convolutional neural network architectures to focus the network on the most important object in the image. This is helpful because scale and rotation make object localization (finding an object within an image) a complex problem.
“The STN is a differentiable module which can be injected in a convolutional neural network. The default choice is to place it right “after” the input layer to make it learn the best transformation matrix theta which minimizes the loss function of the main classifier (in our case, this is IDSIA).”
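The core of an STN is its grid generator: the learned matrix theta defines an affine sampling grid over the input image. Here is a minimal NumPy sketch of that step, not the full differentiable module — the function name and normalized-coordinate convention are illustrative:

```python
import numpy as np

def affine_grid(theta, height, width):
    """Generate a sampling grid from a 2x3 affine matrix theta,
    as in the STN grid generator (coordinates normalized to [-1, 1])."""
    ys, xs = np.meshgrid(np.linspace(-1, 1, height),
                         np.linspace(-1, 1, width), indexing="ij")
    coords = np.stack([xs.ravel(), ys.ravel(), np.ones(height * width)])
    grid = theta @ coords              # (2, H*W) source coordinates
    return grid.reshape(2, height, width)

# The identity transform leaves the sampling grid unchanged.
identity = np.array([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0]])
grid = affine_grid(identity, 4, 4)
```

In the full network, theta comes from a small localization sub-network, and the grid is used to bilinearly sample the input before it reaches the main classifier.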
This is an awesome four-part series on building a miniature self-driving car from scratch, with a big emphasis on hardware and electrical engineering. Part 1 is ROS setup, Part 2 is the sensor suite, Part 3 is the microcontroller, and Part 4 is working with the NVIDIA Jetson TX1. This is quite the hacker project.
“The goal of this project is to build an autonomous base that can navigate the sidewalks of my subdivision. It will use GPS, LIDAR, and other sensors to navigate to GPS way points, avoid obstacles, and return to the start position.”
I love watching videos that students shoot themselves. Here Karol is applying a Single-Shot Detector (SSD) network to identify other vehicles on the road.
Scaling up from miniature self-driving cars to human-sized self-driving cars, Bogdan outlines a self-driving car development platform accessible for under US$10,000. This does not include the sensor suite — just the drive-by-wire platform. He settled on the Renault Twizy and is looking for partners to work on this with him 🙂
“One of the main challenges in the self-driving car industry (among other things like technology itself, policy updates, ethical issues, etc.) is the barrier of entry. If you are a small start up building a local autonomous delivery service or a single engineer trying out latest deep learning approaches for car/traffic sign detection, it is incredible hard (sometimes even impossible) to get things off the ground and test your solution in the real world setting.”
This actually appears to have nothing to do (at least directly) with self-driving cars. Rather, Intel is ramping up chip production, which is a capital-intensive process.
However, this line from the article caught my eye: “Israel traditionally competes with Ireland in benefits offered to Intel in exchange for investment.”
I had never really thought of that, but I’m sure it’s a fact of life for technology executives in both countries. Two small, somewhat isolated, highly-educated, technology-focused countries on opposite edges of Europe, with strong ethnic and expatriate connections to the United States. Of course Israel traditionally competes with Ireland. Now that I think about it, they seem like practically the same country.
And this is interesting because Israel has such a dynamic autonomous vehicle industry. Mobileye, of course, but also research centers for many automotive manufacturers and suppliers, and a cluster of autonomous vehicle startups.
Ireland has been less active in the autonomous vehicle market, but if you believe the theory that Ireland and Israel are practically the same country, then presumably the autonomous vehicle industry is coming to Ireland.
“A fleet of Hyundai Motor Company’s next generation fuel cell electric cars have succeeded in completing a self-driven 190 kilometers journey from Seoul to Pyeongchang. This is the first time in the world that level 4 autonomous driving has been achieved with fuel cell electric cars, the ultimate eco-friendly vehicles.”
The idea that self-driving trucks will actually boost the number of driver jobs is not new to me. However, the recent cross-country trip by self-driving truck startup Embark got me thinking about it.
The Embark drive was only a Level 2 endeavor, and it seems like there were multiple disengagements, but the days of Level 4 trucking on the highway seem near.
Embark’s model is to have autonomous vehicles drive from hub-to-hub on the highways, and human drivers handle the last mile deliveries.
“The autonomous trucks would haul trailers from hub to hub on the freeway, but local drivers would continue to handle the more complex driving tasks associated with the beginning and end of each trip — from origin to highway and from highway to final destination.”
It’s at least plausible that this would result in a net increase in driving jobs, if long-haul costs drop so dramatically that interstate commerce surges.
Enjoy a look at some of the projects our students are building, including Finding Lane Lines, Traffic Sign Classifier, Behavioral Cloning, and more!
Students in our Self-Driving Car Engineer Nanodegree program engage in a project-based curriculum, and from the moment they enroll, they begin addressing key challenges and topics through building specialized projects. Here are all of the projects they build!
Finding Lane Lines
This is the first project students complete, one week into the program.
They learn to work with images, color spaces, thresholds, and gradients, in order to find lane lines on the road. Stack: Python, NumPy, OpenCV
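As a flavor of the thresholding involved, here is a minimal NumPy-only sketch of a gradient threshold — real pipelines use OpenCV Sobel operators and color-space thresholds; the threshold values here are illustrative:

```python
import numpy as np

def gradient_threshold(gray, lo=0.3, hi=1.0):
    """Binary mask of pixels whose horizontal gradient magnitude falls
    within [lo, hi] of the maximum. Lane lines are strong near-vertical
    edges, so the x-gradient responds to them."""
    gx = np.abs(np.gradient(gray.astype(float), axis=1))
    gx /= gx.max() if gx.max() > 0 else 1.0
    return (gx >= lo) & (gx <= hi)

# Synthetic image with one bright vertical stripe (a crude "lane line").
img = np.zeros((10, 10))
img[:, 5] = 255
mask = gradient_threshold(img)
```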
Traffic Sign Classifier
In this project, students train a convolutional neural network to classify traffic signs.
To do so, they use the German Traffic Sign Recognition Benchmark dataset. This particular student went above and beyond to train his network to not only classify signs, but also localize them within the image, and applied his classifier to a video. Stack: Python, NumPy, TensorFlow
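Before training, images are typically normalized so pixel values are centered near zero. A minimal sketch of one common scheme (the exact normalization is a design choice, not prescribed by the project):

```python
import numpy as np

def preprocess(images):
    """Scale uint8 pixel values to roughly [-1, 1] -- a common first
    step before feeding traffic-sign images to a CNN."""
    return (images.astype(float) - 128.0) / 128.0

batch = preprocess(np.array([[0, 128, 255]], dtype=np.uint8))
```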
Behavioral Cloning
Here, students record training data by manually driving a car around a track in a simulator.
Then they use this camera, steering, and throttle data to train an end-to-end neural network for driving the vehicle, based on NVIDIA’s famous research paper. Stack: Python, NumPy, Keras
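One common trick in this project is relabeling the left and right camera images so the network learns to recover toward the lane center. A minimal sketch — the correction offset of 0.2 is an assumed, hand-tuned value:

```python
import numpy as np

# Assumed correction offset: side-camera frames are relabeled as if
# the car needed to steer back toward center.
CORRECTION = 0.2

def augment_side_cameras(steering):
    """Given center-camera steering angles, produce training labels
    for center, left, and right camera images."""
    steering = np.asarray(steering, dtype=float)
    return np.concatenate([steering,               # center: unchanged
                           steering + CORRECTION,  # left: steer right
                           steering - CORRECTION]) # right: steer left

labels = augment_side_cameras([0.0, -0.1])
```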
Advanced Lane Finding
By applying advanced computer vision techniques, such as sliding window tracking, to a dashcam video, students are able to track lane lines on the road under a variety of challenging conditions. Stack: Python, NumPy, OpenCV
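The sliding-window search starts by locating the lane-line base positions from a column histogram of a bird's-eye binary image. A minimal sketch of that first step (function name is illustrative):

```python
import numpy as np

def find_lane_bases(binary_warped):
    """Locate left/right lane starting columns by summing the bottom
    half of a binary bird's-eye image into a histogram -- the first
    step of the sliding-window search."""
    h, w = binary_warped.shape
    histogram = binary_warped[h // 2:, :].sum(axis=0)
    mid = w // 2
    left_base = int(np.argmax(histogram[:mid]))
    right_base = int(mid + np.argmax(histogram[mid:]))
    return left_base, right_base

# Synthetic warped binary image with two vertical lane lines.
img = np.zeros((20, 20))
img[:, 4] = 1
img[:, 15] = 1
bases = find_lane_bases(img)
```

Windows are then stacked upward from each base, re-centering on the detected pixels as they climb the image.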
Vehicle Detection and Tracking
Students use machine learning techniques and feature extraction to identify and track vehicles on a highway. Stack: Python, NumPy, scikit-learn, OpenCV
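One of the hand-crafted features commonly fed to the classifier is a color histogram. A minimal NumPy sketch (bin count is illustrative; real pipelines combine this with HOG and spatial features):

```python
import numpy as np

def color_hist_features(image, bins=8):
    """Concatenate per-channel intensity histograms into a single
    feature vector for the vehicle/non-vehicle classifier."""
    return np.concatenate([
        np.histogram(image[:, :, c], bins=bins, range=(0, 256))[0]
        for c in range(image.shape[2])
    ]).astype(float)

features = color_hist_features(np.zeros((8, 8, 3), dtype=np.uint8))
```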
Extended Kalman Filter
An extended Kalman filter merges noisy simulated radar and lidar data to track a vehicle. Stack: C++, Eigen
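At its heart this is the Kalman predict/update cycle; the "extended" part linearizes nonlinear (radar) measurement models with Jacobians. A minimal 1-D sketch of the underlying cycle — in Python rather than the project's C++, with illustrative noise values:

```python
def kalman_1d(z_measurements, q=0.01, r=1.0):
    """Minimal 1-D Kalman filter: constant-state model with process
    noise q and measurement noise r."""
    x, p = 0.0, 1.0              # state estimate and its variance
    for z in z_measurements:
        p += q                   # predict: uncertainty grows
        k = p / (p + r)          # Kalman gain
        x += k * (z - x)         # update with the measurement residual
        p *= (1 - k)             # uncertainty shrinks after the update
    return x, p

x, p = kalman_1d([1.0, 1.0, 1.0, 1.0])
```

Each consistent measurement pulls the estimate toward 1.0 and shrinks the variance.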
Unscented Kalman Filter
An unscented Kalman filter merges noisy, highly non-linear simulated radar and lidar data to track a vehicle. Stack: C++, Eigen
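Instead of linearizing, the UKF propagates a small set of sigma points through the nonlinear model. A minimal NumPy sketch of sigma-point generation (the spreading parameter lam is illustrative):

```python
import numpy as np

def sigma_points(x, P, lam=1.0):
    """Generate 2n+1 sigma points around mean x with covariance P --
    the UKF pushes these through the nonlinear model rather than
    linearizing it."""
    n = len(x)
    A = np.linalg.cholesky((n + lam) * P)  # matrix square root
    pts = [x]
    for i in range(n):
        pts.append(x + A[:, i])
        pts.append(x - A[:, i])
    return np.array(pts)

pts = sigma_points(np.zeros(2), np.eye(2))
```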
Kidnapped Vehicle
Students develop a particle filter in C++ to probabilistically determine a vehicle's location relative to a sparse landmark map. Stack: C++
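The key step is resampling: particles whose predicted landmark observations match the sensor data get high weights and survive; unlikely ones die off. A minimal sketch in Python rather than the project's C++ (the particle values and weights are illustrative):

```python
import numpy as np

def resample(particles, weights, rng=None):
    """Draw a new particle set in proportion to the weights, so likely
    pose hypotheses are duplicated and unlikely ones are dropped."""
    rng = rng or np.random.default_rng(0)
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return np.asarray(particles)[idx]

# One pose hypothesis matches the landmark observations far better.
particles = np.array([0.0, 5.0, 9.0])
weights = [0.01, 0.98, 0.01]
new_particles = resample(particles, weights)
```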
PID Controller
Students build and tune a proportional-integral-derivative controller to steer a vehicle around a test track, following a target trajectory. Stack: C++
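The controller itself is only a few lines; the project's real work is tuning the three gains. A minimal sketch in Python rather than the project's C++ (gain values are illustrative):

```python
class PID:
    """Minimal PID controller: steering = -(Kp*e + Ki*sum(e) + Kd*de),
    where e is the cross-track error from the target trajectory."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt=1.0):
        self.integral += error * dt
        derivative = (0.0 if self.prev_error is None
                      else (error - self.prev_error) / dt)
        self.prev_error = error
        return -(self.kp * error
                 + self.ki * self.integral
                 + self.kd * derivative)

pid = PID(kp=0.2, ki=0.001, kd=1.0)
steer = pid.update(1.0)   # car is 1.0 unit right of center
```

The proportional term corrects the current error, the integral term removes steady-state bias, and the derivative term damps oscillation.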
Model Predictive Control
Students build and optimize a model predictive controller to steer a vehicle around a test track, following a target trajectory. Stack: C++, Ipopt
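MPC repeatedly simulates the vehicle forward under candidate controls and picks the sequence with the lowest cost. The model it simulates is typically a kinematic bicycle model; here is one step of it as a Python sketch (the Lf value is illustrative):

```python
import numpy as np

def bicycle_step(state, steer, accel, dt=0.1, Lf=2.67):
    """One step of a kinematic bicycle model. state = (x, y, psi, v);
    Lf is the distance from the front axle to the center of gravity."""
    x, y, psi, v = state
    return np.array([
        x + v * np.cos(psi) * dt,      # position advances along heading
        y + v * np.sin(psi) * dt,
        psi + v / Lf * steer * dt,     # heading changes with steering
        v + accel * dt,                # speed changes with throttle
    ])

s = bicycle_step(np.array([0.0, 0.0, 0.0, 10.0]), steer=0.0, accel=1.0)
```

The optimizer (Ipopt in the project) minimizes a cost over many such steps, then applies only the first control before re-planning.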
Path Planning
In this project, students construct a path planner for highway driving based on a finite state machine.
The planner has three components: environmental prediction, maneuver selection, and trajectory generation. Stack: C++
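The maneuver-selection component is typically a finite state machine that evaluates a cost for each reachable successor state. A minimal sketch — the state names and costs are illustrative, and the real costs come from the prediction component:

```python
# Reachable successor maneuvers for each FSM state (illustrative).
SUCCESSORS = {
    "keep_lane":    ["keep_lane", "change_left", "change_right"],
    "change_left":  ["keep_lane", "change_left"],
    "change_right": ["keep_lane", "change_right"],
}

def choose_maneuver(current, costs):
    """Pick the reachable successor with the lowest predicted cost,
    e.g. derived from lane speeds and collision risk."""
    options = SUCCESSORS[current]
    return min(options, key=lambda s: costs.get(s, float("inf")))

best = choose_maneuver(
    "keep_lane",
    {"keep_lane": 0.5, "change_left": 0.2, "change_right": 0.9})
```

Trajectory generation then turns the chosen maneuver into a smooth, drivable path.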
Semantic Segmentation
Students train a pixel-wise segmentation network that identifies and colors road pixels to identify free space for driving. Stack: Python, TensorFlow
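Once the network produces per-pixel class scores, the "coloring" step is just an argmax and an overlay. A minimal NumPy sketch (shapes and the class index are illustrative):

```python
import numpy as np

def color_road(image, logits, road_class=1):
    """Overlay green on pixels whose argmax class is 'road'.
    logits: (H, W, num_classes) per-pixel scores from the network."""
    mask = logits.argmax(axis=-1) == road_class
    out = image.copy()
    out[mask] = [0, 255, 0]
    return out, mask

# Toy 2x2 example: scores mark the bottom row as road.
img = np.zeros((2, 2, 3), dtype=np.uint8)
logits = np.zeros((2, 2, 2))
logits[1, :, 1] = 5.0     # bottom row: road class wins
out, mask = color_road(img, logits)
```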
Safety Case
Students build a prototype of a safety case for a lane-keeping assistance ADAS feature, including the safety plan, hazard analysis and risk assessment, functional safety concept, technical safety concept, and software requirements.
Programming a Real Self-Driving Car
For this project, students form teams to drive a real self-driving car around the Udacity test track.
The car is required to negotiate a traffic light and follow a waypoint trajectory. Code is built first in the simulator, and then deployed to Udacity’s self-driving car in California. Stack: Python, ROS, Autoware, TensorFlow
Six months ago I wound up on a plane next to an executive from BYTON, an autonomous vehicle startup targeting the Chinese market.
At the time, I was unfamiliar with the company. But since then, BYTON has appeared more in the press. Most recently, they announced a partnership with Chris Urmson’s Aurora to power the autonomous stack in BYTON vehicles.
BYTON is notable for a few things.
First, the company seems to be a hybrid of China, Europe, and Silicon Valley, with leaders coming from all three locations. I wonder if more startups will be organized that way in the future.
Second, they are betting big on China’s electric vehicle mandate. The exact number or percent of vehicles that must be electric is a little hard to pin down, but it seems to be on the order of ten percent by 2020. BYTON is presumably hoping that being an electric-first vehicle company will give them an advantage.
To our friends at Alphabet: we are partners, you are an important investor in Uber, and we share a deep belief in the power of technology to change people’s lives for the better. Of course, we are also competitors. And while we won’t agree on everything going forward, we agree that Uber’s acquisition of Otto could and should have been handled differently.
My wife’s car is finally showing its age. At 14 years and 216,000 miles, it’s starting to remind us that it needs to be replaced.
So we’ve been car shopping, which has called my attention to the gap between the self-driving cars the world is gearing up for, and what’s actually out on the market.
Some major brands only introduced adaptive cruise control this year. In other cases, getting ADAS features means stepping all the way up to the highest trim level.
Want an Autopilot-like experience? There are a few options, but they’re all on cars that sell for $60,000-plus.
I’m excited about where self-driving cars are taking us, and how quickly we are getting there. But going car shopping illustrates how far off this must seem to people who don’t work in the space every day.