"Perhaps radar is even underappreciated. Venture capital has flowed into lidar and camera-based solutions for automated vehicles; radar has been viewed as a commodity."
That seems right to me.
The article highlights three companies working on different approaches to more advanced radar for self-driving cars. The work from Bosch to create radar-based high-definition maps seems particularly interesting.
"By coupling these two inputs [radar and GPS], Bosch's system can take that real-time data and compare it to its base map, match patterns between the two, and determine its location with centimeter-level accuracy."
Bosch calls this approach "radar road signature" and posits that it can provide centimeter-level accuracy while using half as much data as a camera-based map.
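Bosch hasn't published implementation details, but the underlying pattern-matching step is easy to sketch. Here's a minimal, hypothetical version in Python: rasterize live radar returns into an occupancy grid, then brute-force correlate it against the stored signature map to refine a coarse GPS prior. Every name and parameter here is my own illustration, not Bosch's.

```python
import numpy as np

def rasterize(points, resolution=0.5, size=64):
    """Bin 2D radar returns (x, y in meters, vehicle-centered) into a grid."""
    grid = np.zeros((size, size))
    idx = (points / resolution + size // 2).astype(int)
    ok = (idx >= 0).all(axis=1) & (idx < size).all(axis=1)
    grid[idx[ok, 1], idx[ok, 0]] = 1.0
    return grid

def match_to_map(live, map_grid, search=5):
    """Brute-force search for the cell shift that best aligns the live
    radar grid with the stored signature map; refines a coarse GPS fix."""
    best_score, best_shift = -np.inf, (0, 0)
    for dx in range(-search, search + 1):
        for dy in range(-search, search + 1):
            score = np.sum(np.roll(map_grid, (dy, dx), axis=(0, 1)) * live)
            if score > best_score:
                best_score, best_shift = score, (dx, dy)
    return best_shift  # multiply by resolution to get a metric correction
```

A production system would also search over heading and treat the match probabilistically, but the centimeter-level accuracy claim ultimately rests on this kind of alignment against a prior map.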
Bosch is highlighting their work with TomTom to build radar-based HD maps. They divide these maps into three layers (sketched in code after the list):
Localization: calculating where the car is in the world
Planning: deciding which actions are available to the car
Dynamic: predicting what other actors in the environment will do
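To make those three layers concrete, here's a hypothetical sketch of what a single tile of such a map might carry. The schema is entirely my invention; Bosch and TomTom haven't published one.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class HDMapTile:
    # Localization layer: static radar features the car matches to place itself
    radar_landmarks: List[Tuple[float, float, float]]   # (x, y, reflectivity)
    # Planning layer: semantics that determine which actions are available
    lane_polylines: List[List[Tuple[float, float]]]     # lane boundary geometry
    speed_limit_kph: float
    # Dynamic layer: short-lived state about other actors, refreshed constantly
    tracked_objects: List[Dict] = field(default_factory=list)
```

The localization layer changes rarely, the planning layer changes occasionally, and the dynamic layer is refreshed continuously, which is a big part of why splitting the map this way makes sense.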
This is exciting work because high-definition (HD) maps are usually the domain of lidar. Lidar point clouds are used to generate maps against which a vehicle can compare later sensor readings.
Some work has gone into attempts to build such maps with camera data. Visual SLAM is one example of this. By comparison, relatively little work has gone into building HD maps from radar data. That makes Bosch's endeavor novel and exciting.
Bosch is positioning this as a fleet-based mapping system, with map data generated by ordinary consumer cars, not necessarily specialized mapping vehicles. It's hard to know how realistic that really is, but it would play to Bosch's strength of scale.
"One million vehicles will keep the high-resolution map up to date."
As the world's largest automotive supplier, Bosch has a unique ability to push a successful product into the automotive market at scale.
Ford CTO Ken Washington, who used to be like my boss's boss's boss's boss's boss when I was at Ford, and seems like a great guy, has a post up about Digit, a humanoid robot that Ford is working on for last-mile deliveries.
Reading the post and watching the video, I have a few reactions:
This is awesome.
This will be insanely hard.
Giving a robot a lidar for a head is a stroke of genius, at least from an aesthetic perspective.
Ford is completely right that the last-mile (really, last-ten-yards) delivery problem is going to be a huge issue. Right now logistics companies rely on drivers both to operate a vehicle and to walk deliveries to customers' front doors. Self-driving cars solve the first problem, but in a lot of cases that won't ultimately have much of an impact if we can't solve the second one.
So the motivation for Digit is spot-on.
But walking robots are bananas-level difficult.
Look no further than this video with the awesome title, "A Compilation of Robots Falling Down at the DARPA Robotics Challenge":
Granted, this video is from 4 years ago and progress has been made, but my impression is that walking robots make self-driving cars look like an easy problem.
I remember taking an edX course from Russ Tedrake at MIT called "Underactuated Robotics" that was concerned with, among other things, walking robots. This course was so, so hard. The control problems inherent in a multi-joint, walking robot are of a staggering level of mathematical complexity.
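For a taste of why: the course builds everything on the standard manipulator equation,

```latex
M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} = \tau_g(q) + B\,u
```

where q is the vector of joint positions, M(q) the inertia matrix, C the Coriolis terms, τ_g gravity, and B the matrix mapping actuator inputs u onto the joints. "Underactuated" means B has lower rank than q has dimensions: more degrees of freedom than motors (nobody puts a motor between the foot and the ground), so the robot cannot command an arbitrary instantaneous acceleration, and the usual feedback-linearization tricks break down.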
Digit's demo video is awesome, but we've all learned to be skeptical of demo videos. If Ford, together with Agility Robotics, can really crack the nut on a walking robot that can deliver packages, then they won't even need to solve the autonomous vehicle problem. They'll have the whole world beating down their door.
The goal of this program is to offer a much deeper dive into perception and sensor fusion than we were able to do in our core Self-Driving Car Engineer Nanodegree Program. This is a great option for students who want to develop super-advanced, cutting-edge skills for working with lidar, camera, and radar data, and fusing that data together.
The first three months of the program are brand new content and projects that we've never taught before. The final month, on Kalman filters, comes from our core Self-Driving Car Engineer Nanodegree Program. The course is designed to last four months for new students. Students who have already graduated the core Self-Driving Car Engineer Nanodegree Program should be able to finish this specialized Sensor Fusion Nanodegree Program in about three months.
Curriculum
Course 1: Lidar
Instructor: Aaron Brown, Mercedes-Benz
Lesson: Introduction. View lidar point clouds with the Point Cloud Library (PCL).
Lesson: Point Cloud Segmentation. Program the RANSAC algorithm to segment and remove the ground plane from a lidar point cloud.
Lesson: Clustering. Draw bounding boxes around objects (e.g. vehicles and pedestrians) by grouping points with Euclidean clustering and k-d trees.
Lesson: Real Point Cloud Data. Apply segmentation and clustering to data streaming from a lidar sensor on a real self-driving car.
Lesson: Lidar Obstacle Detection Project. Filter, segment, and cluster real lidar point cloud data to detect vehicles and other objects!
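The course itself works in C++ with PCL, but the core of the segmentation lesson, a RANSAC plane fit, is compact enough to sketch. This is my own illustrative numpy version, not course code:

```python
import numpy as np

def ransac_ground_plane(cloud, iters=100, tol=0.2, seed=0):
    """Fit a ground plane to an Nx3 lidar cloud with RANSAC.
    Returns a boolean mask marking inlier (ground) points."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(cloud), dtype=bool)
    for _ in range(iters):
        # Hypothesize a plane from three random points
        p0, p1, p2 = cloud[rng.choice(len(cloud), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:
            continue  # degenerate (collinear) sample, try again
        normal /= norm
        # Count points within tol meters of the hypothesized plane
        inliers = np.abs((cloud - p0) @ normal) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers

# ground = cloud[mask]; obstacles = cloud[~mask], ready for clustering
```

Everything left over after removing the ground plane gets handed to the clustering lesson to be grouped into obstacles.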
Course 2: Radar
Instructor: Abdullah Zaidi, Metawave
Lesson: Radar Principles. Measure an object's range using the physical properties of radar.
Lesson: Range-Doppler Estimation. Perform a fast Fourier transform (FFT) on a frequency-modulated continuous wave (FMCW) radar signal to create a Doppler map for object detection and velocity measurement.
Lesson: Clutter, CFAR, AoA. Filter noisy radar data in order to reduce both false positives and false negatives.
Lesson: Clustering and Tracking. Track a vehicle with the Automated Driving System Toolbox in MATLAB.
Lesson: Radar Target Generation and Detection Project. Design a radar system using FMCW, signal processing, FFT, and CFAR!
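The radar course runs in MATLAB, but its two central ideas, the 2D FFT that produces a range-Doppler map and cell-averaging CFAR thresholding, translate directly. A rough numpy sketch with illustrative (not course) parameters:

```python
import numpy as np

def range_doppler_map(beat):
    """beat: (num_chirps, samples_per_chirp) FMCW mixer output.
    FFT across samples gives range; FFT across chirps gives Doppler."""
    rdm = np.fft.fft2(beat)
    return np.abs(np.fft.fftshift(rdm, axes=0))  # center zero Doppler

def ca_cfar(power, train=8, guard=2, scale=4.0):
    """1D cell-averaging CFAR: a cell is a detection if it exceeds
    scale times the mean of its training cells (guard cells excluded)."""
    hits = np.zeros_like(power, dtype=bool)
    half = train + guard
    for i in range(half, len(power) - half):
        noise = np.r_[power[i - half:i - guard],
                      power[i + guard + 1:i + half + 1]]
        hits[i] = power[i] > scale * noise.mean()
    return hits
```

CFAR is what keeps the detector honest: instead of one global threshold, each cell is judged against its local noise floor, which cuts both false positives and false negatives.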
Course 3: Camera
Instructor: Andreas Haja, HAJA Consulting
Lesson: Computer Vision. Learn how cameras capture light to form images.
Lesson: Collision Detection. Design a system to measure the time to collision (TTC) with both lidar and camera sensors.
Lesson: Tracking Image Features. Identify key points in an image and track those points across successive images, using BRISK and SIFT, in order to measure velocity.
Project: 2D Feature Tracking. Compare key point detectors to track objects across images!
Lesson: Combining Camera and Lidar. Project lidar points backward onto a camera image in order to fuse sensor modalities. Perform neural network inference on the fused data in order to track a vehicle.
Lesson: Track An Object in 3D. Combine point cloud data, computer vision, and deep learning to track a moving vehicle and estimate time to collision!
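The key formula in the collision-detection lesson is tiny: assuming constant velocity, two successive range measurements give you time to collision directly. A minimal sketch (my own naming, not course code):

```python
def time_to_collision(d_prev, d_curr, dt):
    """Constant-velocity TTC from two range readings to the lead vehicle.
    d_prev, d_curr: distances in meters; dt: time between frames in seconds."""
    closing = d_prev - d_curr          # how much the gap shrank this frame
    if closing <= 0:
        return float("inf")            # gap is stable or growing; no collision
    return d_curr * dt / closing       # seconds until the gap reaches zero
```

The camera-only version of the lesson computes the same quantity without any absolute distance at all, using the frame-to-frame ratio of distances between tracked key points, since an approaching object grows in the image in proportion to its closing rate.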
Course 4: Kalman Filters
Instructors: Dominic Nuss, Michael Maile, and Andrei Vatavu, Mercedes-Benz
Lesson: Sensors. Differentiate sensor modalities based on their strengths and weaknesses.
Lesson: Kalman Filters. Combine multiple sensor measurements using Kalman filters, a probabilistic tool for data fusion.
Lesson: Extended Kalman Filters. Build a Kalman filter pipeline that smooths non-linear sensor measurements.
Lesson: Unscented Kalman Filters. Linearize data around multiple sigma points in order to fuse highly non-linear data.
Project: Tracking with an Unscented Kalman Filter. Track an object using both radar and lidar data, fused with an unscented Kalman filter!
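The plain linear Kalman filter that anchors the course fits in a dozen lines; the extended and unscented variants change how the predict and update steps cope with non-linear models, not this overall structure. A minimal numpy sketch:

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Project the state x and covariance P forward through motion model F."""
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    """Fold measurement z (measurement model H, noise R) into the estimate."""
    y = z - H @ x                        # innovation: measurement minus prediction
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain: how much to trust z
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```

Fusion falls out naturally: lidar and radar measurements just use different H and R matrices against the same underlying state.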
Partners
One of the highlights of working at Udacity is partnering with world experts to teach complex skills to anybody in the world.
In this program we are fortunate to work especially closely with autonomous vehicle engineers from Mercedes-Benz. They appear throughout the Nanodegree Program, often as the primary instructors, and sometimes simply to offer their expertise and context on other topics.
MathWorks has also been a terrific partner, offering our students free educational licenses for MATLAB. The radar course in this program is taught primarily in MATLAB and leverages several of MathWorks' newest and most advanced toolboxes.
That sums up how I felt building this Nanodegree Program. We spent over a year kicking around ideas for this program, starting work and stopping work, and there were times I thought it wasn't going to happen. Then we got the right group of instructors together, and it came together faster than I ever imagined, and it's beautiful.
The discussion centered around how to generate training data for machine learning, and I spoke specifically about simulated data for self-driving cars.
This is super-duper exciting! Waymo is several years ahead of everybody else developing self-driving cars, but until now their vehicles have been off-limits to the general public. I see them scooting around Mountain View all the time, but the only way I can get a ride in one is to pull a favor from a friend who works there.
Now Waymos will be open, albeit in very small initial numbers, to anybody in Phoenix, via Lyft's network.
This announcement also makes Phoenix the second place in the world, alongside Lyft's partnership with Aptiv in Las Vegas, where a member of the general public can hail a self-driving robotaxi. They still come with safety drivers, but it's nonetheless a big step forward.
"The transcript highlights, in particular, the distinction between NXP's traditional automotive semiconductor business, which declined, and its advanced driver assistance systems (ADAS) and battery management systems (BMS), both of which grew dramatically, albeit from small bases."
NXP is kind of like the automotive industry in miniature: vehicle sales are declining today, causing decreases in revenue associated with traditional automotive manufacturing. But in the not-so-distant future, mobility will change and new products, like advanced driver assistance systems and battery management systems, will grow quickly.
My Udacity colleague Vienna Harvey sat down with Australian podcaster Zoe Eather to discuss the role of both ethics and education as they relate to self-driving cars. It's a fun episode.
This interview is part of Zoe's Smart Community podcast, which covers everything from infrastructure, to data, to climate change, to mobility.
Prior to Vienna's interview, I got to take Zoe for a spin in Carla, Udacity's self-driving car. Zoe was delightful and I think you'll enjoy listening to her and Vienna geek out about self-driving cars.
MarketWatch reports that Waymo will create 400 jobs at the site, which is meaningful, but also not game-changing. This seems primarily like an expansion of Waymo's existing facility in nearby Novi, Michigan. The goal is probably to do the same type of work on more vehicles, not to fundamentally expand the scope of operation.
By all appearances, Waymo purchases what are essentially off-the-shelf Chrysler Pacifica and Jaguar I-PACE vehicles, and brings them to this facility to convert them into autonomous vehicles.
I imagine there are a lot of similarities between the work Waymo does in Michigan and the work AutonomouStuff has been doing in Peoria, Illinois, for years. To become a self-driving car, an off-the-shelf vehicle needs augmented power supplies, new computers, a lot more sensors, and a substantial amount of wiring.
That takes a lot of work, especially if Waymo plans to do that for tens of thousands of vehicles.
However, Waymo does not appear to be building out a manufacturing plant to build the vehicles themselves. Maybe things will head in that direction eventually, but I'd bet not.
There has been a lot of speculation that the automotive industry will start to look something like the airline industry. Ridesharing companies will purchase vehicles from manufacturers like Chrysler, the same way airlines purchase airplanes from manufacturers like Boeing. Then the ridesharing company or airline outfits the vehicles or airplanes to its specification. The latest Waymo news feels like a step in that direction.
CNBC reports that Apple is in discussions with "at least four companies as possible suppliers for next-generation lidar sensors in self-driving cars."
The report also suggests that, "The iPhone maker is setting a high bar with demands for a 'revolutionary design.' … In addition to evaluating potential outside suppliers, Apple is believed to have its own internal lidar sensor under development."
If anything, Apple's hardware design strengths should make this an even easier task than it was for Waymo, so it seems totally plausible Apple could pull this off.
The question is: to what end?
I know very little about why Waymo started designing its own lidar, but I know they started building self-driving cars with the Velodyne HDL-64 "chicken bucket" model.
My guess is that Google began developing their own lidar several years ago not because they needed a much better sensor, but rather because they couldn't get enough sensors of any type.
Several years ago, when Google would have begun developing its lidar program, Velodyne was one of the only lidar manufacturers in the world. And even Velodyne was severely constrained in the number of units it could produce. There was a period a few years ago when the waiting list to buy a Velodyne lidar unit was months long.
In that world, it would have made a lot of sense for Google to begin developing its own lidar program. That would've removed one possible bottleneck for building self-driving cars at scale.
Fast-forward to 2019. Velodyne has taken massive investment capital to build lidar factories, and there are upwards of sixty lidar companies (mostly startups) developing sensors. Today, there isn't the same need or urgency to develop custom lidar units. In fact, all of those lidar startups are basically doing that on their own.
So it's not totally clear to me what Apple would gain from creating their own lidar program.
Volkswagen announced it is testing (present tense) self-driving cars in Hamburg. The press release details that there are five self-driving e-Golfs testing on a three-kilometer stretch of road in Hamburg.
This would be a minor announcement in the US, where a number of different companies are testing fleets of this size (or bigger) within geofences of this size (or bigger). But surprisingly little testing has happened on public roads in Germany, so it is terrific to see Volkswagen take this step. This might actually be the first major test I can recall in that country.
That said, the press release is a little coy on the exact setup. While the scenario is described as "real driving conditions", the test is also said to be taking place in a special autonomous vehicle "test bed" that is still under construction.
My sense is that this test is probably not on truly "public" roads that any regular driver might pass through. That said, it seems like a good precursor to that kind of test.
"This is the first time Volkswagen has begun to test automated driving to Level 4 at real driving conditions in a major German city. From now, a fleet of five e-Golf, equipped with laser scanners, cameras, ultrasonic sensors and radars, will drive on a three-kilometer section of the digital test bed for automated and connected driving in the Hanseatic city."
The press release does have some interesting and specific details about the vehicles themselves:
"The e-Golf configured by Volkswagen Group Research have eleven laser scanners, seven radars and 14 cameras. Up to 5 gigabytes of data are communicated per minute during the regular test drives, each of which lasts several hours. Computing power equivalent to some 15 laptops is tucked away in the trunk of the e-Golf."