Radar-Based High-Definition Maps

Automotive News has a cool article about advances in automotive radar for autonomous vehicles. The article makes the point that radar has been a little bit like the forgotten sensor:

“Perhaps radar is even underappreciated. Venture capital has flowed into lidar and camera-based solutions for automated vehicles; radar has been viewed as a commodity.”

That seems right to me.

The article highlights three companies working on different approaches to more advanced radar for self-driving cars. The work from Bosch to create radar-based high-definition maps seems particularly interesting.

“By coupling these two inputs [radar and GPS], Bosch’s system can take that real-time data and compare it to its base map, match patterns between the two, and determine its location with centimeter-level accuracy.”

Bosch calls this approach “radar road signature” and posits that it can provide centimeter-level accuracy while using half as much data as a camera-based map.
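To make that concrete, here is a toy sketch of the general scan-to-map matching idea: seed a search with GPS, then find the offset that best aligns the live radar returns with the landmarks stored in the base map. To be clear, this is my own minimal illustration of the technique, not Bosch’s actual (and presumably far more sophisticated) pipeline.

```python
# Toy scan-to-map matching: NOT Bosch's algorithm, just an illustration
# of refining a coarse GPS fix against a stored radar landmark map.
import numpy as np

def localize(base_map, scan, gps_guess, search=1.0, step=0.05):
    """base_map:  (N, 2) landmark positions from the stored map
       scan:      (M, 2) radar returns, already rotated into the map frame
       gps_guess: rough (x, y) position from GPS, seeding the search"""
    best_offset, best_cost = np.asarray(gps_guess, float), np.inf
    # Grid-search small translations around the GPS guess; score each
    # candidate by the distance from scan points to their nearest landmarks.
    for dx in np.arange(-search, search, step):
        for dy in np.arange(-search, search, step):
            candidate = np.asarray(gps_guess, float) + [dx, dy]
            pts = scan + candidate
            d = np.min(np.linalg.norm(
                pts[:, None, :] - base_map[None, :, :], axis=2), axis=1)
            if d.sum() < best_cost:
                best_cost, best_offset = d.sum(), candidate
    return best_offset  # refined position estimate
```

With a fine enough search grid and a dense enough landmark map, this kind of matching is how a coarse GPS fix gets refined down toward centimeter accuracy.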

Bosch is highlighting its work with TomTom to build radar-based HD maps. The two companies divide these maps into three layers:

  • Localization: calculating where the car is in the world
  • Planning: deciding which actions are available to the car
  • Dynamic: predicting what other actors in the environment will do

This is exciting work because high-definition (HD) maps are usually the domain of lidar. Lidar point clouds are used to generate maps against which a vehicle can compare later sensor readings.

Some work has gone into attempts to build such maps with camera data. Visual SLAM is one example of this. By comparison, relatively little work has gone into building HD maps from radar data. That makes Bosch’s endeavor novel and exciting.

Bosch is positioning this as a fleet-based mapping system, with map data generated by ordinary consumer cars, not necessarily specialized mapping vehicles. It’s hard to know how realistic that really is, but it would play to Bosch’s strength of scale.

“One million vehicles will keep the high-resolution map up to date.”

As the world’s largest automotive supplier, Bosch has a unique ability to push a successful product into the automotive market at scale.

Robots On Foot Are Harder Than Robots On Wheels

Ford CTO Ken Washington, who used to be like my boss’s boss’s boss’s boss’s boss when I was at Ford, and seems like a great guy, has a post up about Digit, a humanoid robot that Ford is working on for last-mile deliveries.

https://www.youtube.com/watch?v=WHWciIxNK2c

Reading the post and watching the video, I have a few reactions:

  1. This is awesome.
  2. This will be insanely hard.
  3. Giving a robot a lidar for a head is a stroke of genius, at least from an aesthetic perspective.

Ford is completely right that the last-mile (really, last-ten-yards) delivery problem is going to be a huge issue. Right now logistics companies rely on drivers both to operate a vehicle and to walk deliveries to customers’ front doors. Self-driving cars solve the first problem, but in many cases that won’t have much of an impact if we can’t also solve the second.

So the motivation for Digit is spot-on.

But walking robots are bananas-level difficult.

Look no further than this video with the awesome title, “A Compilation of Robots Falling Down at the DARPA Robotics Challenge”:

Granted, that video is from four years ago and progress has been made since, but my impression is that walking robots make self-driving cars look like an easy problem.

I remember taking an edX course from Russ Tedrake at MIT called “Underactuated Robotics” that was concerned with, among other things, walking robots. This course was so, so hard. The control problems inherent in a multi-joint walking robot are staggeringly complex mathematically.
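For a flavor of why, the course works with robot dynamics in the “manipulator equation” form (I’m writing this from memory, so treat the notation as a sketch):

M(q)q̈ + C(q, q̇)q̇ = τ_g(q) + Bu

Here q is the vector of joint positions, M is the inertia matrix, C collects the Coriolis terms, τ_g is gravity, and B maps the motor commands u into the joints. A system is underactuated when B cannot command an arbitrary acceleration q̈, and a walking robot is essentially always underactuated: no motor directly actuates the contact between the foot and the ground, so the controller has to reason about dynamics it cannot directly control.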

Digit’s demo video is awesome, but we’ve all learned to be skeptical of demo videos. If Ford, together with Agility Robotics, can really crack the nut on a walking robot that can deliver packages, then they won’t even need to solve the autonomous vehicle problem. They’ll have the whole world beating down their door.

Udacity’s Sensor Fusion Nanodegree Program!

Udacity’s Sensor Fusion Nanodegree Program launched yesterday! I am so happy to get this one out to students 😁

Goal

The goal of this program is to offer a much deeper dive into perception and sensor fusion than we were able to do in our core Self-Driving Car Engineer Nanodegree Program. This is a great option for students who want to develop super-advanced, cutting-edge skills for working with lidar, camera, and radar data, and fusing that data together.

The first three months of the program are brand new content and projects that we’ve never taught before. The final month, on Kalman filters, comes from our core Self-Driving Car Nanodegree Program. The course is designed to last four months for new students. Students who have already graduated the core Self-Driving Car Engineer Nanodegree Program should be able to finish this specialized Sensor Fusion Nanodegree Program in about three months.

Curriculum

Course 1: Lidar
Instructor: Aaron Brown, Mercedes-Benz
Lesson: Introduction. View lidar point clouds with Point Cloud Library (PCL).
Lesson: Point Cloud Segmentation. Program the RANSAC algorithm to segment and remove the ground plane from a lidar point cloud (see the sketch after this course listing).
Lesson: Clustering. Draw bounding boxes around objects (e.g. vehicles and pedestrians) by grouping points with Euclidean clustering and k-d trees.
Lesson: Real Point Cloud Data. Apply segmentation and clustering to data streaming from a lidar sensor on a real self-driving car.
Lesson: Lidar Obstacle Detection Project. Filter, segment, and cluster real lidar point cloud data to detect vehicles and other objects!
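To give a flavor of the segmentation lesson mentioned above, here is a minimal RANSAC ground-plane sketch. The course project itself uses C++ and PCL, so treat this Python/numpy version purely as an illustration of the algorithm:

```python
# Minimal RANSAC plane segmentation: repeatedly fit a plane to three
# random points and keep the plane with the most inliers (ground points).
import numpy as np

def ransac_plane(points, iterations=100, threshold=0.2):
    """points: (N, 3) lidar point cloud; returns indices of ground inliers."""
    rng = np.random.default_rng(0)
    best_inliers = np.array([], dtype=int)
    for _ in range(iterations):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p2 - p1, p3 - p1)
        if np.linalg.norm(normal) < 1e-9:   # skip degenerate (collinear) samples
            continue
        normal = normal / np.linalg.norm(normal)
        # Distance from every point to the candidate plane.
        d = np.abs((points - p1) @ normal)
        inliers = np.flatnonzero(d < threshold)
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers

# Everything NOT on the ground plane is an obstacle candidate:
# obstacles = np.delete(cloud, ransac_plane(cloud), axis=0)
```

Everything that survives the plane removal then goes to clustering, which groups the remaining points into individual obstacles.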

Course 2: Radar
Instructor: Abdullah Zaidi, Metawave
Lesson: Radar Principles. Measure an object’s range using the physical properties of radar.
Lesson: Range-Doppler Estimation. Perform a fast Fourier transform (FFT) on a frequency-modulated continuous-wave (FMCW) radar signal to create a range-Doppler map for object detection and velocity measurement (see the sketch after this course listing).
Lesson: Clutter, CFAR, AoA. Filter noisy radar data in order to reduce both false positives and false negatives.
Lesson: Clustering and Tracking. Track a vehicle with the Automated Driving System Toolbox in MATLAB.
Lesson: Radar Target Generation and Detection Project. Design a radar system using FMCW, signal processing, FFT, and CFAR!
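To sketch the range-Doppler lesson referenced above: an FMCW radar mixes each transmitted chirp with its echo, producing a beat signal whose frequency encodes range, while the phase shift from one chirp to the next encodes velocity. A 2D FFT over a frame of chirps yields a range-Doppler map. The course does this in MATLAB; here is a toy Python simulation with made-up parameters:

```python
# Toy FMCW range-Doppler simulation (illustrative parameter values only).
import numpy as np

c = 3e8                     # speed of light (m/s)
fc = 77e9                   # carrier frequency, typical automotive radar
B = 150e6                   # chirp bandwidth (Hz)
Tchirp = 50e-6              # chirp duration (s)
slope = B / Tchirp
Ns, Nd = 1024, 128          # samples per chirp, chirps per frame

R, v = 50.0, 10.0           # one simulated target: 50 m away, receding at 10 m/s
t = np.arange(Ns) / Ns * Tchirp

# Beat signal: frequency encodes range, chirp-to-chirp phase encodes Doppler.
frame = np.zeros((Nd, Ns), dtype=complex)
for k in range(Nd):
    r = R + v * k * Tchirp                 # target range during chirp k
    f_beat = slope * 2 * r / c             # range-dependent beat frequency
    phase = 4 * np.pi * fc * r / c         # Doppler phase term
    frame[k] = np.exp(1j * (2 * np.pi * f_beat * t + phase))

# Range FFT along each chirp, then Doppler FFT across chirps.
rd_map = np.fft.fftshift(np.fft.fft(np.fft.fft(frame, axis=1), axis=0), axes=0)

peak = np.unravel_index(np.abs(rd_map).argmax(), rd_map.shape)
print("range estimate:", peak[1] * c / (2 * B), "m")   # ~50 m
```

The CFAR lesson then deals with pulling real targets like this one out of a map full of clutter.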

Course 3: Camera
Instructor: Andreas Haja, HAJA Consulting
Lesson: Computer Vision. Learn how cameras capture light to form images.
Lesson: Collision Detection. Design a system to measure the time to collision (TTC) with both lidar and camera sensors.
Lesson: Tracking Image Features. Identify key points in an image and track those points across successive images, using BRISK and SIFT, in order to measure velocity (see the sketch after this course listing).
Project: 2D Feature Tracking. Compare key point detectors to track objects across images!
Lesson: Combining Camera and Lidar. Project lidar points backward onto a camera image in order to fuse sensor modalities. Perform neural network inference on the fused data in order to track a vehicle.
Lesson: Track An Object in 3D. Combine point cloud data, computer vision, and deep learning to track a moving vehicle and estimate time to collision!
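As a taste of the feature-tracking lesson flagged above, here is a hedged OpenCV sketch that detects and matches BRISK keypoints across two frames (the file names are hypothetical):

```python
# Match BRISK keypoints between two consecutive camera frames.
import cv2

img1 = cv2.imread("frame_0000.png", cv2.IMREAD_GRAYSCALE)  # hypothetical files
img2 = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)

brisk = cv2.BRISK_create()
kp1, des1 = brisk.detectAndCompute(img1, None)
kp2, des2 = brisk.detectAndCompute(img2, None)

# BRISK produces binary descriptors, so match with Hamming distance;
# cross-checking discards matches that aren't mutual best matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# The pixel displacement of matched keypoints across frames is the raw
# signal that feeds the time-to-collision estimate.
for m in matches[:10]:
    print(kp1[m.queryIdx].pt, "->", kp2[m.trainIdx].pt)
```

Swapping BRISK for SIFT (and NORM_HAMMING for NORM_L2) is a one-line change, which is exactly the kind of detector comparison the 2D Feature Tracking project asks for.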

Course 4: Kalman Filters
Instructors:
Dominic Nuss, Michael Maile, and Andrei Vatavu, Mercedes-Benz
Lesson: Sensors. Differentiate sensor modalities based on their strengths and weaknesses.
Lesson: Kalman Filters. Combine multiple sensor measurements using Kalman filters, a probabilistic tool for data fusion (see the sketch after this course listing).
Lesson: Extended Kalman Filters. Build a Kalman filter pipeline that smooths non-linear sensor measurements.
Lesson: Unscented Kalman Filters. Propagate multiple sigma points through non-linear models in order to fuse highly non-linear data.
Project: Tracking with an Unscented Kalman Filter. Track an object using both radar and lidar data, fused with an unscented Kalman filter!
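The predict/update cycle at the heart of the Kalman filter lesson above fits in a few lines. Here is a minimal 1D constant-velocity sketch in Python/numpy, with toy values throughout:

```python
# Minimal linear Kalman filter: track position and velocity from noisy
# position measurements (all values are toy numbers).
import numpy as np

dt = 0.1
F = np.array([[1, dt], [0, 1]])    # motion model: constant velocity
H = np.array([[1, 0]])             # we measure position only
Q = np.diag([0.01, 0.01])          # process noise covariance
R = np.array([[0.5]])              # measurement noise covariance

x = np.zeros((2, 1))               # state estimate [position, velocity]
P = np.eye(2)                      # state covariance

def step(x, P, z):
    # Predict: push the state and covariance through the motion model.
    x, P = F @ x, F @ P @ F.T + Q
    # Update: blend prediction and measurement, weighted by the Kalman gain.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
for k in range(50):                # object moving at 1 m/s, noisy sensor
    z = np.array([[k * dt + rng.normal(0, 0.7)]])
    x, P = step(x, P, z)
print("position, velocity estimate:", x.ravel())
```

The extended and unscented variants in the later lessons handle the non-linear motion and measurement models that radar, in particular, introduces.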

Partners

One of the highlights of working at Udacity is partnering with world experts to teach complex skills to anybody in the world.

In this program we are fortunate to work especially closely with autonomous vehicle engineers from Mercedes-Benz. They appear throughout the Nanodegree Program, often as the primary instructors, and sometimes simply offering their expertise and context on other topics.

MathWorks has also been a terrific partner, offering our students free educational licenses for MATLAB. The radar course in this program is taught primarily in MATLAB and leverages several of MathWorks’ newest and most advanced toolboxes.

Reflection

There is a quote, from a completely different context: “It took forever and then it took a night.”

That sums up how I felt building this Nanodegree Program. We spent over a year kicking around ideas for this program, starting work and stopping work, and there were times I thought it wasn’t going to happen. Then, once we got the right group of instructors together, it came together faster than I ever imagined, and it’s beautiful.

Waymo Opens To The Public

Last week, Waymo announced it will put 10 of its vehicles on Lyft’s network in Phoenix. Any Lyft user will be able to ride.

This is super-duper exciting! Waymo is several years ahead of everybody else developing self-driving cars, but until now their vehicles have been off-limits to the general public. I see them scooting around Mountain View all the time, but the only way I can get a ride in one is to call in a favor from a friend who works there.

Now Waymos will be open, albeit in very small initial numbers, to anybody in Phoenix, via Lyft’s network.

This announcement also makes Phoenix the second place in the world, alongside Lyft’s partnership with Aptiv in Las Vegas, where a member of the general public can hail a self-driving robotaxi. They still come with safety drivers, but it’s nonetheless a big step forward.

NXP Earnings And The Future Of Automotive Manufacturing

I spent a few hours this morning racing down the rabbit hole of NXP’s Q1 2019 earnings call, which I wrote up for Forbes.com:

“The transcript highlights, in particular, the distinction between NXP’s traditional automotive semiconductor business, which declined, and its advanced driver assistance systems (ADAS) and battery management systems (BMS), both of which grew dramatically, albeit from small bases.”

NXP is kind of like the automotive industry in miniature: vehicle sales are declining today, causing decreases in revenue associated with traditional automotive manufacturing. But in the not-so-distant future, mobility will change and new products, like advanced driver assistance systems and battery management systems, will grow quickly.

Read the whole thing.

And I should also mention that my Forbes.com editor, Alan Ohnsman, has recruited a terrific stable of automotive writers. The daily output of the Forbes.com transportation section is voluminous. Just in the last day you can read about shadow testing at Tesla, Ford’s Q1 earnings, the effect of self-driving cars on the automotive repair market, Ford’s connected vehicles efforts, GM’s upcoming electric pickup truck, Tesla’s cash crunch, Ford’s investment in Rivian, and Waymo’s lidar units.

Self-Driving Car Ethics

My Udacity colleague Vienna Harvey sat down with Australian podcaster Zoe Eather to discuss the role of both ethics and education as they relate to self-driving cars. It’s a fun episode 🙂

This interview is part of Zoe’s Smart Community podcast, which covers everything from infrastructure, to data, to climate change, to mobility.

Prior to Vienna’s interview, I got to take Zoe for a spin in Carla, Udacity’s self-driving car. Zoe was delightful and I think you’ll enjoy listening to her and Vienna geek out about self-driving cars.

Waymo Goes (Just A Little) Big In Michigan

Waymo just announced that it has decided on a facility in Detroit to modify its fleet of Chrysler and Jaguar Land Rover vehicles into self-driving cars.

MarketWatch reports that Waymo will create 400 jobs at the site, which is meaningful, but also not game-changing. This seems primarily like an expansion of Waymo’s existing facility in nearby Novi, Michigan. The goal is probably to do the same type of work on more vehicles, not to fundamentally expand the scope of operation.

By all appearances, Waymo purchases what are essentially off-the-shelf Chrysler Pacifica and Jaguar I-PACE vehicles, and brings them to this facility to convert them into autonomous vehicles.

I imagine there are a lot of similarities between the work Waymo does in Michigan and the work AutonomouStuff has been doing in Peoria, Illinois, for years. To become a self-driving car, an off-the-shelf vehicle needs augmented power supplies, new computers, a lot more sensors, and a substantial amount of wiring.

That takes a lot of work, especially if Waymo plans to do that for tens of thousands of vehicles.

However, Waymo does not appear to be building out a manufacturing plant to build the vehicles themselves. Maybe things will head in that direction eventually, but I’d bet not.

There has been a lot of speculation that the automotive industry will start to look something like the airline industry. Ridesharing companies will purchase vehicles from manufacturers like Chrysler, the same way airlines purchase airplanes from manufacturers like Boeing. Then the ridesharing company or airline outfits the vehicles or airplanes to its specification. The latest Waymo news feels like a step in that direction.

Apple Lidar: Designed In California, Built…Somewhere

CNBC reports that Apple is in discussions with “at least four companies as possible suppliers for next-generation lidar sensors in self-driving cars.”

The report also suggests that, “The iPhone maker is setting a high bar with demands for a ‘revolutionary design.’…In addition to evaluating potential outside suppliers, Apple is believed to have its own internal lidar sensor under development.”

Waymo managed to pull off this trick with its Laser Bear Honeycomb lidar, designed in-house and the subject of pretty intense litigation with Uber.

If anything, Apple’s hardware design strengths should make this an even easier task for Apple than for Waymo, so it seems totally plausible Apple could pull this off.

The question is: to what end?

I know very little about why Waymo started designing its own lidar, but I know they started building self-driving cars with the Velodyne HDL-64 “chicken bucket” model.

My guess is that Google began developing their own lidar several years ago not because they needed a much better sensor, but rather because they couldn’t get enough sensors of any type.

Several years ago, when Google would have begun developing its lidar program, Velodyne was one of the only lidar manufacturers in the world. And even Velodyne was severely constrained in the number of units it could produce. There was a period a few years ago when the waiting list to buy a Velodyne lidar unit was months long.

In that world, it would have made a lot of sense for Google to begin developing its own lidar program. That would’ve reduced one possible bottleneck for building self-driving cars at scale.

Fast-forward to 2019. Velodyne has taken massive investment capital to build lidar factories, and there are upwards of sixty lidar companies (mostly startups) developing sensors. Today, there isn’t the same need or urgency to develop custom lidar units. In fact, all of those lidar startups are basically doing that on their own.

So it’s not totally clear to me what Apple would gain from creating their own lidar program.

Volkswagen Is Testing Real Driving Conditions In Hamburg

Volkswagen announced it is testing (present tense) self-driving cars in Hamburg. The press release details that there are five self-driving e-Golfs testing on a three-kilometer stretch of road in Hamburg.

This would be a minor announcement in the US, where a number of different companies are testing fleets of this size (or bigger) within geofences of this size (or bigger). But surprisingly little testing has happened on public roads in Germany, so it is terrific to see Volkswagen take this step. This might actually be the first major test I can recall in that country.

That said, the press release is a little coy on the exact setup. While the scenario is described as “real driving conditions”, the test is also said to be taking place in a special autonomous vehicle “test bed” that is still under construction.

My sense is that this test is probably not on truly “public” roads that any regular driver might pass through. That said, it seems like a good precursor to that kind of test.

“This is the first time Volkswagen has begun to test automated driving to Level 4 at real driving conditions in a major German city. From now, a fleet of five e-Golf, equipped with laser scanners, cameras, ultrasonic sensors and radars, will drive on a three-kilometer section of the digital test bed for automated and connected driving in the Hanseatic city.”

The press release does have some interesting and specific details about the vehicles themselves:

“The e-Golf configured by Volkswagen Group Research have eleven laser scanners, seven radars and 14 cameras. Up to 5 gigabytes of data are communicated per minute during the regular test drives, each of which lasts several hours. Computing power equivalent to some 15 laptops is tucked away in the trunk of the e-Golf.”
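For scale, 5 gigabytes per minute works out to roughly 300 gigabytes per hour, so each of those multi-hour test drives generates on the order of a terabyte of data.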